Similar Documents
20 similar documents found (search time: 35 ms)
1.
Understanding and quantifying the learning–forgetting process helps predict the performance of an individual (or a group of individuals), estimate labor costs, bid on new and repeated orders, estimate the costs of strikes, schedule production, develop training programs, set time standards, and improve work methods [IIE Trans. 29 (1997) 759]. Although there is agreement that the form of the learning curve is as presented in [J. Aeronaut. Sci. 3 (1936) 122], scientists and practitioners have not yet developed a full understanding of the behavior of, and factors affecting, the forgetting process. The paucity of research on forgetting curves has been attributed to the practical difficulties involved in obtaining data on the level of forgetting as a function of time [IIE Trans. 21 (1989) 376]. The learn–forget curve model (LFCM) has been shown to have many advantages over other theoretical models that capture the learning–forgetting relationship. However, the deficiency of the LFCM lies in its assumption that the time for total forgetting is invariant to the experience gained prior to interruption. This paper attempts to correct this deficiency by incorporating the findings of [Int. J. Ind. Ergon. 10 (1992) 217] into the LFCM. Numerical examples are used to illustrate the behavior of the modified LFCM (MLFCM) and to compare its results to those of the LFCM.
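The ideas above can be sketched in a few lines: Wright's power-law learning curve plus a forgetting adjustment whose time-to-total-forgetting grows with prior experience, which is the modification's key departure from the LFCM. The logarithmic form of `total_forget_time` and all parameter values are illustrative assumptions, not the paper's actual equations.

```python
import math

def wright_time(n, t1=10.0, b=0.3):
    """Wright's power-law learning curve: time to produce the n-th unit."""
    return t1 * n ** (-b)

def total_forget_time(units_learned, base=25.0, k=15.0):
    """MLFCM-style assumption (hypothetical form and parameters): the time
    to total forgetting grows with the experience accumulated before the
    interruption, instead of being constant as in the LFCM."""
    return base + k * math.log1p(units_learned)

def retained_fraction(units_learned, break_len):
    """Fraction of experience retained after an interruption of length
    break_len; experience is fully lost once the break reaches the
    (experience-dependent) time for total forgetting."""
    return max(0.0, 1.0 - break_len / total_forget_time(units_learned))
```

Under these assumptions, a worker with more pre-interruption experience retains a larger fraction of it after the same break, which is the behavior the numerical examples illustrate.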

2.
The limitations of frequency-domain filtering methods have motivated the development of alternative techniques, in which a filter is applied to a time–frequency distribution instead of the Fourier spectrum. One such distribution is the S-transform, a modified short-time Fourier transform whose window scales with frequency, as in wavelets. Recently it has been shown that the S-transform's local spectra have time-domain equivalents. Since each of these is associated with a particular window position on the time axis, collectively they give a time–time distribution. This distribution, called the TT-transform, exhibits differential concentration of different frequency components, with higher frequencies being more strongly concentrated around the localization position than lower frequencies. This leads to the idea of filtering on the time–time plane, in addition to the time–frequency plane. Examples of time–frequency filtering and time–time filtering are presented.
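A minimal numerical sketch of the S-transform in its standard frequency-domain formulation, showing the defining property described above: a Gaussian window whose width scales inversely with frequency. The row/column layout and normalization are illustrative choices, not taken from this paper.

```python
import numpy as np

def s_transform(h):
    """Discrete S-transform (frequency-domain formulation).
    Returns an (N//2 + 1) x N array: rows are frequency voices,
    columns are window positions on the time axis."""
    N = len(h)
    H = np.fft.fft(h)
    S = np.zeros((N // 2 + 1, N), dtype=complex)
    S[0, :] = np.mean(h)                        # zero-frequency voice
    m = np.arange(N)
    m_wrapped = (m + N // 2) % N - N // 2       # center the window at m = 0
    for n in range(1, N // 2 + 1):
        # Gaussian window in the frequency domain; width scales with n
        gauss = np.exp(-2.0 * np.pi ** 2 * m_wrapped ** 2 / n ** 2)
        S[n, :] = np.fft.ifft(H[(m + n) % N] * gauss)
    return S
```

For a pure sinusoid, the magnitude of the transform peaks on the row corresponding to that frequency at every window position, which is the local-spectrum behavior the TT-transform builds on.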

3.
Neural Computing and Applications - Lean manufacturing is a systematic method to improve productivity and reduce costs in industry. One of the tools of lean is cellular manufacturing...

4.
The paper analyzes a manufacturing system made up of one workstation which is able to produce concurrently a number of product types with controllable production rates in response to time-dependent product demands. Given a finite planning horizon, the objective is to minimize production cost, which is incurred when the workstation is not idle, and inventory and backlog costs, which are incurred when meeting demand results in inventory surpluses and shortages. With the aid of the maximum principle, optimal production regimes are derived and continuous-time scheduling is reduced to a combinatorial problem of sequencing and timing the regimes. The problem is proved to be polynomially solvable if demand does not exceed the capacity of the workstation, or if demand is steadily pressing and the costs are “agreeable”.

Scope and purpose

Efficient utilization of modern flexible manufacturing systems depends heavily on proper scheduling of products across the available facilities. Scheduling of a workstation which concurrently produces a number of product types with controllable production rates in response to continuous, time-dependent demand is considered here. As in the systems considered by many authors in recent years, a buffer with unlimited capacity is placed after the workstation for each product type. The objective is to minimize inventory storage, backlog and production costs over a finite planning horizon. Numerical approaches are commonly used to approximate the optimal solution for similar problems. The key contribution of this work is that the continuous-time scheduling problem is reduced to a combinatorial problem, exactly solvable in polynomial time if demand does not exceed the capacity of the workstation or if the manufacturing system is organized such that early production and storage of a product to reduce later backlogs are justified.
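The cost structure being minimized can be illustrated with a discrete-time sketch (the paper works in continuous time via the maximum principle; the discretization and the cost coefficients here are illustrative assumptions):

```python
def schedule_cost(rates, demand, c_prod=1.0, c_inv=0.5, c_back=2.0):
    """Total cost of a production-rate schedule against a demand stream:
    a production cost is charged in every period the workstation is busy,
    an inventory cost on surpluses, and a backlog cost on shortages."""
    stock, cost = 0.0, 0.0
    for r, d in zip(rates, demand):
        stock += r - d
        if r > 0:
            cost += c_prod                     # busy-period production cost
        cost += c_inv * max(stock, 0.0)        # inventory surplus cost
        cost += c_back * max(-stock, 0.0)      # backlog (shortage) cost
    return cost
```

A scheduler would search over sequences of production regimes (rates) to minimize this quantity; the combinatorial reduction in the paper does exactly that over regime sequencing and timing.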

5.
To resolve the trade-off between accuracy and efficiency in previous methods for estimating the man-hours of similar parts, a man-hour estimation method is proposed that is both accurate and easy to implement in software, thereby improving efficiency. The mechanism by which machining processes affect the man-hours of manufacturing features is studied, the concept of manufacturing-feature combinations is proposed, and on this basis five man-hour databases are developed: production preparation, loading/unloading, tool change, rough machining, and finish machining. The concept of a manufacturing flowchart is introduced and its construction method described. Man-hours for similar parts are estimated by modifying the manufacturing flowchart of a historical part and by retrieving and computing data from the man-hour databases. The estimation of man-hours for similar parts within a part family produced by an enterprise is used as an example to verify the effectiveness of the proposed method.

6.
Computational Grids and peer‐to‐peer (P2P) networks enable the sharing, selection, and aggregation of geographically distributed resources for solving large‐scale problems in science, engineering, and commerce. The management and composition of resources and services for scheduling applications, however, becomes a complex undertaking. We have proposed a computational economy framework for regulating the supply of and demand for resources and allocating them for applications based on the users' quality‐of‐service requirements. The framework requires economy‐driven deadline‐ and budget‐constrained (DBC) scheduling algorithms for allocating resources to application jobs in such a way that the users' requirements are met. In this paper, we propose a new scheduling algorithm, called the DBC cost–time optimization scheduling algorithm, that aims not only to optimize cost, but also time when possible. The performance of the cost–time optimization scheduling algorithm has been evaluated through extensive simulation and empirical studies for deploying parameter sweep applications on global Grids. Copyright © 2005 John Wiley & Sons, Ltd.  
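The cost-time optimization idea can be illustrated with a greedy sketch: fill the cheapest resources first and, among equally priced resources, prefer the faster one, so completion time improves at no extra cost. The resource model (price per job, jobs per hour) and the greedy policy are simplifying assumptions for illustration, not the broker algorithm evaluated in the paper.

```python
def dbc_cost_time(n_jobs, resources, deadline, budget):
    """Greedy deadline-and-budget-constrained (DBC) cost-time sketch.
    resources: list of (price_per_job, jobs_per_hour) tuples.
    Returns (jobs assigned per resource, jobs left unscheduled, money spent)."""
    # cheapest first; tie-break on speed (descending) -> time optimized for free
    order = sorted(resources, key=lambda r: (r[0], -r[1]))
    assigned = {i: 0 for i in range(len(order))}
    remaining, spent = n_jobs, 0.0
    for i, (price, speed) in enumerate(order):
        can_do = min(remaining, int(speed * deadline))   # fits in the deadline
        affordable = int((budget - spent) // price) if price > 0 else can_do
        take = min(can_do, affordable)                   # fits in the budget
        assigned[i] = take
        remaining -= take
        spent += take * price
        if remaining == 0:
            break
    return assigned, remaining, spent
```

With two resources at the same price, the faster one receives the jobs first, which is precisely the "optimize time when it costs nothing extra" behavior the algorithm aims for.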

7.
Since 1918, hand–arm vibration (HAV) exposure, principally but not exclusively from vibrating power tools and processes, has affected some 1.5 to 2 million regularly exposed U.S. workers and many more worldwide. These HAV exposures usually lead to an irreversible disease of the fingers and hands called hand–arm vibration syndrome (HAVS), whose prevalence is as high as 50% in exposed worker populations. HAVS results not only in hand–arm deterioration but invariably in job loss. To help combat the mounting HAV problem, domestic and international consensus HAV exposure standards were developed and promulgated in the early 1980s, and in 2005 the European Union, for the first time, passed into law exposure standards for both HAV and whole‐body vibration. In response, in 2006 the American National Standards Institute (ANSI) replaced its 1986 HAV exposure standard, S3.34, with a completely revised HAV standard, S2.70‐2006, ushering in profound new implications for power tool users, tool manufacturers, and countless related manufacturing operations throughout the United States. The background, salient aspects, safety and health considerations, and manufacturing implications of this new ANSI S2.70 HAV standard are discussed. © 2008 Wiley Periodicals, Inc.

8.
Lean Manufacturing—often simply referred to as “Lean”—is a process management philosophy that aims to improve the way in which products are manufactured. It does this through identifying and removing waste and creating a smooth transition between stages in the production process. To a large extent, it relies on visual and simple mechanical aids to assist in improving manufacturing effectiveness. However, when it comes to combining several aspects of Lean or when dealing with complex environments, quantitative modelling becomes essential to achieve the full benefits of Lean.

9.
Quality function deployment (QFD) is becoming a widely used customer-oriented approach and tool in product design. Taking into account the financial factors and uncertainties in the product design process, this paper deals with a fuzzy formulation combined with a genetic-based interactive approach to QFD planning. By introducing the new concepts of planned degree, actual achieved degree, actual primary costs required and actual planned costs, two types of fuzzy optimisation models are discussed. These models consider not only the overall customer satisfaction but also the enterprise satisfaction with the costs committed to the product. With the interactive approach, the best balance between enterprise satisfaction and overall customer satisfaction can be obtained, and the preferred solutions under different business criteria can be achieved through human–computer interaction.

Scope and purpose

Quality function deployment (QFD), which originated in Japan in the late 1960s, is a concept and mechanism for translating the ‘voice of the customer’ into a product through the various stages of product planning, engineering and manufacturing. It has become a widely used customer-oriented approach to facilitating product design by analysing customer requirements (CRs). Determining the target levels for the technical attributes (TAs) of a product so as to achieve a high level of overall customer satisfaction is an important activity in product design and development. Traditional methods for QFD planning are mainly subjective, ad hoc and heuristic. They can hardly achieve global optimisation, and most of these models barely take into consideration the correlation between TAs. Moreover, most of these methods are technically one-sided and neglect the design budget, even though the financial factor is important and should not be ignored in QFD planning. In addition, owing to the uncertainties involved in the decision process, such deterministic methods cannot formulate and solve the problem effectively.
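The underlying trade-off being optimized, customer satisfaction against a design budget, can be shown with a crisp brute-force sketch. The paper's actual models are fuzzy and solved with a genetic algorithm; this enumeration over hypothetical target levels only illustrates the objective and constraint.

```python
from itertools import product

def qfd_plan(levels, weights, costs, budget):
    """Choose a target level for each technical attribute (TA) so as to
    maximize weighted customer satisfaction subject to a design budget.
    levels[i]: number of candidate levels for TA i (>= 2)
    weights[i]: customer-importance weight of TA i
    costs[i][x]: cost of setting TA i to level x"""
    best, best_sat = None, -1.0
    for combo in product(*[range(l) for l in levels]):
        cost = sum(costs[i][x] for i, x in enumerate(combo))
        if cost > budget:                      # enterprise (budget) constraint
            continue
        # satisfaction grows with normalized target level
        sat = sum(weights[i] * x / (levels[i] - 1) for i, x in enumerate(combo))
        if sat > best_sat:
            best, best_sat = combo, sat
    return best, best_sat
```

Sweeping the budget and re-solving traces out the satisfaction/cost frontier along which the interactive approach lets the decision maker pick a preferred balance.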

10.
To address the problem that, in mass-customization environments, product manufacturing time is affected by customization factors and cannot be determined accurately, the determination of manufacturing time for customized products is explored. Based on modular principles, product manufacturing time is divided hierarchically into feature time modules (physical time modules and logical time modules) and sub-feature time modules (operation time modules and customization time modules). Case-based reasoning is used to determine the quotas of the operation time modules; the mechanism by which customization factors affect manufacturing time is analyzed and a quota model for the customization time modules is constructed; and a mathematical model of product manufacturing time is obtained by integrating the feature time modules. Finally, the manufacture of transformer G at a company is used as an example to verify the correctness of this modular approach to determining manufacturing time.
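At its simplest, the hierarchical time-module model reduces to summing module quotas up the hierarchy; a sketch with entirely hypothetical module names and values (the paper additionally derives the customization-module quotas via case-based reasoning and an influence model):

```python
def product_time(feature_modules):
    """Total manufacturing time as the sum of feature time modules, each
    of which is the sum of its sub-feature modules: an operation time
    quota plus an optional customization time quota."""
    return sum(
        sub["operation"] + sub.get("customization", 0.0)
        for feature in feature_modules
        for sub in feature["sub_modules"]
    )
```

In practice the "customization" entries would come from the customization-factor quota model rather than being fixed constants.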

11.
Due to the gloomy global economy, competition among manufacturing firms is getting even tougher. Lean production, agile manufacturing, total quality management, and world‐class manufacturing philosophies are not providing robust solutions that enable companies to face such a turbulent environment. This ongoing research work proposes a framework consisting of a set of strategies that enable companies not only to maintain their market share but to expand it. It also focuses on speeding up time to market as a key driver of business competitiveness. © 2000 John Wiley & Sons, Inc.

12.
Selecting the order of an input–output model of a dynamical system is a key step toward the goal of system identification. The false nearest neighbors (FNN) algorithm is a useful tool for estimating the order of linear and nonlinear systems. While advanced FNN uses nonlinear input–output data-based models for the model-based selection of the threshold constant used to compute the percentage of false neighbors, the computational effort of the method increases with the amount of data and the dimension of the model. To increase the efficiency of this method, in this paper we propose a clustering-based algorithm. Clustering is applied to the product space of the input and output variables, and the model structure is then estimated on the basis of the cluster covariance matrix eigenvalues. The main advantage of the proposed solution is that it is model-free: no particular model needs to be constructed in order to select the order of the model, while most other techniques are ‘wrapped’ around a particular model construction method. This saves computational effort and avoids a possible bias due to the particular construction method used. Three simulation examples are given to illustrate the proposed technique: estimation of the model structure for a linear system, a polymerization reactor and the van der Vusse reactor.
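The covariance-eigenvalue criterion can be sketched as follows: if a candidate order is sufficient, the regression (product-space) data lie on a hypersurface, so the smallest eigenvalue of their covariance is near zero. For brevity this sketch uses one global covariance rather than the paper's per-cluster covariances, which is a simplifying assumption.

```python
import numpy as np

def order_score(u, y, order):
    """Smallest eigenvalue of the covariance of the product-space data
    [y(k-order..k-1), u(k-order..k-1), y(k)]. A near-zero value signals
    that the candidate order is sufficient to explain y(k)."""
    rows = [np.r_[y[k - order:k], u[k - order:k], y[k]]
            for k in range(order, len(y))]
    Z = np.array(rows)
    cov = np.cov(Z, rowvar=False)
    return np.linalg.eigvalsh(cov)[0]   # eigenvalues in ascending order
```

For a noise-free first-order linear system, the order-1 score is numerically zero, whereas data with no input-output relation yield a clearly positive smallest eigenvalue.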

13.
A model for preventive maintenance operations and forecasting (total citations: 2; self-citations: 1; citations by others: 1)
Equipment costs constitute the great majority of overall costs in semiconductor manufacturing. Maintaining high equipment availability has therefore been regarded as one of the industry's major goals. The ability to correctly forecast equipment preventive maintenance (PM) timing requirements can help not only to maximize equipment uptime but also to minimize negative impacts on production efficiency. This research used grey theory and evaluation diagnosis to construct a PM forecasting model for predicting the PM timing of various machines. The results showed significant improvements in PM timing predictions compared with the existing experience-based method and with an alternative method proposed by Li and Chang (Semiconductor Manufacturing Technology Workshop 2002: 10–11, pp. 275–277) for the same fab cases. Received: June 2005 / Accepted: December 2005
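Grey theory's standard GM(1,1) model is the usual starting point for grey forecasting of sparse time series like PM intervals. The abstract does not give the authors' exact formulation, so the textbook version is sketched here:

```python
import numpy as np

def gm11(x0, steps=1):
    """GM(1,1) grey forecasting: fit dx1/dt + a*x1 = b on the accumulated
    series x1 = cumsum(x0), then predict the next `steps` values of x0."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                         # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])              # mean-generated sequence
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # grey parameters
    n = len(x0)

    def x1_hat(k):                             # fitted accumulated series
        return (x0[0] - b / a) * np.exp(-a * k) + b / a

    # inverse accumulation gives the forecast of the original series
    return [x1_hat(n + s) - x1_hat(n + s - 1) for s in range(steps)]
```

GM(1,1) fits near-exponential sequences very closely with only a handful of observations, which is why it suits equipment-degradation and PM-interval data.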

14.
Many manufacturing companies experience increasing product variety, which leads not only to significantly increased operations costs but also to an uneven distribution of these costs across product variants. To deal with this challenge, managers need to accurately quantify the costs of product variety-induced complexity to understand its impact on company revenues. However, quantifying complexity costs is no trivial task, as both the variety-related factors that generate more complexity and the costs implied by these factors are specific to each company. We therefore extensively reviewed the relevant literature to build a list of possible complexity cost factors generated by product variety, here named variety-induced complexity cost factors (VCCFs). This list is intended to serve managers as a reference set of VCCFs to help identify, and subsequently quantify, the costs of product variety-related complexity in manufacturing companies, and finally to attribute them to end product variants. To evaluate the usefulness of this list, the identified VCCFs were examined in six companies using both the judgment of managers and data from their enterprise resource planning systems. This empirical examination provides not only evidence of the list's usefulness but also data, rarely available in the academic literature, on the costs of product variety-induced complexity and their proportion of company revenue. It also suggests that it is important to research effective ways to collect, process, and present the data needed to calculate the costs of product variety-related complexity, and reports a viable approach used in the six companies.

15.

The very raison d’être of cyber threat intelligence (CTI) is to provide meaningful knowledge about cyber security threats. The exchange and collaborative generation of CTI by means of sharing platforms has proven to be an important aspect of its practical application. It is evident that inaccurate, incomplete, or outdated threat intelligence is a major problem, as only high-quality CTI can help detect and defend against cyber attacks. Additionally, while the amount of available CTI is increasing, there is no guarantee that its quality remains unaffected. It is thus in the best interest of every stakeholder to be aware of the quality of a CTI artifact, which allows for informed decisions and permits detailed analyses. Our work makes a twofold contribution to the challenge of assessing threat intelligence quality. We first propose a series of relevant quality dimensions and configure metrics to assess the respective dimensions in the context of CTI. In a second step, we showcase the extension of an existing CTI analysis tool to make the quality assessment transparent to security analysts. Furthermore, analysts’ subjective perceptions are, where necessary, included in the quality assessment concept.
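Two hypothetical examples of the kind of metrics such quality dimensions might configure, timeliness and completeness; the paper's actual dimensions, field sets, and decay functions are not given in this abstract, so everything here is an illustrative assumption:

```python
from datetime import datetime, timedelta

def timeliness(created, now, decay_days=30.0):
    """Timeliness score decaying linearly from 1 (fresh) to 0 once a
    CTI artifact is older than decay_days (hypothetical decay model)."""
    age_days = (now - created).total_seconds() / 86400.0
    return max(0.0, 1.0 - age_days / decay_days)

def completeness(artifact, required_fields):
    """Fraction of required fields that are present and non-empty."""
    return sum(bool(artifact.get(f)) for f in required_fields) / len(required_fields)
```

Per-artifact scores like these could then be aggregated, or weighted by analysts' subjective perceptions, to make quality transparent in an analysis tool.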


16.
Multi‐touch driven user interfaces are becoming increasingly prevalent because of their intuitiveness and the falling cost of the associated hardware. In recognition of this trend, multi‐touch software frameworks (MSFs) have begun to emerge. These frameworks abstract the low‐level issues of multi‐touch software development and deployment, enabling software developers who are unfamiliar with the complexities of multi‐touch software development to implement and deploy multi‐touch applications more easily. However, some multi‐touch applications have real‐time system requirements, and at present no MSFs support the development and deployment of such real‐time multi‐touch applications. As a result, software developers are unable to take advantage of MSFs and are forced to handle the complexities of multi‐touch and real‐time systems development and deployment themselves in an ad hoc manner, so the multi‐touch and/or real‐time aspects of the application may not function correctly. In this paper, guidelines are presented for applying real‐time system concepts to support the development and deployment of real‐time multi‐touch applications using MSFs. This increases the probability that an application will meet its timing requirements while also reducing the complexity of the development and deployment process for multi‐touch applications. Copyright © 2013 John Wiley & Sons, Ltd.

17.
The Lean Manufacturing approach requires advanced and efficient manufacturing technologies in order to meet customer demands. Manufacturing companies have increased their productivity and efficiency over time by implementing new strategies, business processes and IT solutions. Best practices also allow companies to achieve on-demand manufacturing through the integration of a pull-flow production strategy. To achieve the agility needed to meet business needs, a key success factor is the flexible integration of different information system components to enable the flow of exchanged data and information. To ensure the agility of an enterprise’s organization, the ISA S-95 standard can be used to determine which information has to be exchanged between system components. We propose an Industrial System Integration Architecture, a Lean Enterprise Service Bus that addresses Lean Manufacturing constraints through semantic aspects with respect to the ISA S-95 standard. This architecture aims at enhancing interoperability between the production system and the global enterprise information system in terms of business and manufacturing requirements, and establishes semantic interoperability across the industrial system.

18.
Lean approaches to product development (LPD) have had a strong influence on many industries, and in recent years there have been many proponents of lean in software development, as it can support the increasing industry need to scale agile software development. With its roots in industrial manufacturing and, later, industrial product development, it would seem natural that LPD would adapt well to large-scale development projects of increasingly software-intensive products, such as in the automotive industry. However, it is not clear what experience and results have been reported on the actual use of lean principles and practices in software development for such large-scale industrial contexts. This was the motivation for this study, whose context was an ongoing industry process improvement project at Volvo Car Corporation and Volvo Truck Corporation.

19.
To improve the efficiency of setting man-hour quotas in make-to-order enterprises and thus meet customers' delivery-date requirements, a new method for computing man-hour quotas is proposed. Based on the structural composition of a product family, the method divides product-family machining time into three major man-hour modules: static man-hours, flexible man-hours, and product-specific man-hours. The concept of a time module is introduced, the flexible man-hour modules are described using extension (extenics) transformations, and the time of each man-hour module is computed through process-based cluster analysis. Similar manufacturing objects are retrieved through a three-level matter-element matching search and an affair-element matching search, enabling rapid reuse of machining times. Finally, an example verifies the feasibility and effectiveness of the method.

20.
The main objective of this paper is to present an approach for accomplishing verification in the early design phases of a system, which makes system verification easier, specifically for systems with timing restrictions. For this purpose we use RT‐UML sequence diagrams in the design phase and translate these diagrams into timed automata, on which verification is performed using model checking techniques. Specifically, we use the Object Management Group's UML Profile for Schedulability, Performance, and Time, and from specifications written using this profile we obtain the corresponding timed automata. The RT‐UML profile is used in conjunction with UPPAAL, a well‐known tool for simulating and analyzing the behaviour of real‐time dynamic systems described by timed automata, to validate and verify the timing requirements. Copyright © 2009 John Wiley & Sons, Ltd.
