Found 20 similar documents (search time: 0 ms)
1.
Maria C. Yang 《Research in Engineering Design》2009,20(1):1-11
The generation of ideas is an essential element in the process of design. One suggested approach to improving the quality
of ideas is through increasing their quantity. In this study, concept generation is examined via brainstorming, morphology
charts and sketching. Statistically significant correlations were found between the quantity of brainstormed ideas and design
outcome. In some, but not all, experiments, correlations were found between the quantity of morphological alternatives and
design outcome. This discrepancy between study results hints at the role of project length and difficulty in design. The volume
of dimensioned drawings generated during the early-to-middle phases of design was found to correlate with design outcome,
suggesting the importance of concrete sketching, timing and milestones in the design process.
2.
Z. Ayag 《International Journal of Production Research》2013,51(4):687-713
As many companies have demonstrated over time, a conceptual design for a new product contributes greatly to an improvement in competitiveness, because it permits a reduction of costs, an increase in quality, and, often, a shortening of the time necessary to get the product on the market. That is why the evaluation of conceptual design alternatives created in a new product development (NPD) environment has long been a vital and critical point for the future success of companies in fast-growing markets. These alternatives can be evaluated using the analytic hierarchy process (AHP), one of the most commonly used multiple criteria decision making (MCDM) techniques. This technique is used to reduce the number of conceptual design alternatives after ranking them using the scores obtained from the process. Furthermore, another technique, simulation analysis integrated with the AHP, is also used to help decision makers (product engineers or managers) perform economic analyses of the AHP's high-score alternatives using data from the generated simulation model of a real-life manufacturing system in which the final alternative is produced. In short, the objectives of this research are: first, to use the AHP technique to evaluate conceptual design alternatives in an NPD environment; second, to use a simulation generator integrated with this technique in order to perform economic analyses for the AHP's high-score alternatives. Finally, the results of both techniques, simulation and AHP, are used in a benefit/cost (B/C) analysis to reach a final decision on the conceptual design alternatives.
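The core AHP ranking step described in this abstract can be sketched as follows. The pairwise comparison matrix, the three concepts, and the random index value are invented for illustration; the geometric-mean prioritization shown here is one common AHP variant, not necessarily the exact procedure of the paper.

```python
import numpy as np

# Pairwise comparison matrix for three hypothetical design concepts
# (illustrative Saaty-scale judgments, not data from the paper).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority vector via the geometric-mean (row) method.
gm = A.prod(axis=1) ** (1.0 / A.shape[0])
weights = gm / gm.sum()

# Consistency ratio: lambda_max from A @ w; RI = 0.58 for n = 3.
lam_max = (A @ weights / weights).mean()
ci = (lam_max - A.shape[0]) / (A.shape[0] - 1)
cr = ci / 0.58

print(weights.round(3), round(cr, 3))
```

A consistency ratio below 0.1 is the usual threshold for accepting the judgments; the resulting weights rank the alternatives before any simulation-based economic analysis.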
3.
Kevin N. Otto 《Research in Engineering Design》1995,7(2):86-101
Among the many tasks designers must perform, evaluation of product options based on performance criteria is fundamental. Yet I have found that the methods commonly used remain controversial and uncertain among those who apply them. In this paper, I apply mathematical measurement theory to analyze and clarify common design methods. The methods can be analyzed to determine the level of information required and the quality of the answer provided. At its simplest, a method using an ordinal scale only arranges options based on a performance objective. More complex, an interval scale also indicates the difference in performance provided. To construct an interval scale, the designer must provide two basic a priori items of information. First, a base-point design is required from which the remaining designs are relatively measured. Second, the deviation of each remaining design from the base-point design is compared using a metric datum design. Given these two datums, any other design can be evaluated numerically. I show that concept selection charts operate with interval scales. After an interval scale, the next more complex scale is a ratio scale, where the objective has a well-defined zero value. I show that QFD methods operate with ratio scales. Of all measurement scales, the most complex are extensively measurable scales. Extensively measurable scales have a well-defined base value, a metric value and a concatenation operation for adding values. I show that standard optimization methods operate with extensively measurable scales. Finally, it is also possible to make evaluations with non-numeric scales. These may be more convenient, but are no more general.
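The two-datum construction of an interval scale can be illustrated in a few lines. The designs, the mass figures, and the choice of datums are invented; this is a sketch of the idea, not Otto's formalism.

```python
# Interval-scale evaluation: design "A" serves as the base-point datum
# (defines the zero) and "B" as the metric datum (defines one scale unit).
# All names and numbers are hypothetical.
raw_mass_kg = {"A": 12.0, "B": 9.5, "C": 14.0}
base = raw_mass_kg["A"]            # base-point datum
metric = raw_mass_kg["B"] - base   # metric datum: one unit of the scale

interval_score = {d: (m - base) / metric for d, m in raw_mass_kg.items()}
# Differences between scores are meaningful; an absolute zero is not.
print(interval_score)
```

Changing either datum shifts or rescales every score uniformly, which is exactly why only differences, not ratios, carry meaning on an interval scale.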
4.
The Pugh Controlled Convergence method: model-based evaluation and implications for design theory
Daniel D. Frey Paulien M. Herder Ype Wijnia Eswaran Subrahmanian Konstantinos Katsikopoulos Don P. Clausing 《Research in Engineering Design》2009,20(1):41-58
This paper evaluates the Pugh Controlled Convergence method and its relationship to recent developments in design theory.
Computer executable models are proposed simulating a team of people involved in iterated cycles of evaluation, ideation, and
investigation. The models suggest that: (1) convergence of the set of design concepts is facilitated by the selection of a
strong datum concept; (2) iterated use of an evaluation matrix can facilitate convergence of expert opinion, especially if
used to plan investigations conducted between matrix runs; and (3) ideation stimulated by the Pugh matrices can provide large
benefits both by improving the set of alternatives and by facilitating convergence. As a basis for comparison, alternatives to Pugh’s methods, such as using a single summary criterion or a Borda count, were also assessed. These models suggest that Pugh’s method, under a substantial range of assumptions, results in better design outcomes than those from these alternative procedures.
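A minimal Pugh-style evaluation matrix, as discussed in this entry, can be sketched as follows. The criteria, concepts, and +1/0/−1 judgments are invented for illustration; a real Pugh process iterates this matrix across investigation cycles rather than scoring once.

```python
# Each concept is judged better (+1), same (0), or worse (-1) than the
# datum concept on every criterion; the tallies guide convergence.
criteria = ["cost", "weight", "reliability"]   # hypothetical criteria
scores = {
    "concept_B": [+1, 0, -1],
    "concept_C": [+1, +1, 0],
    "datum":     [0, 0, 0],                    # datum scores itself as 0
}

totals = {name: sum(s) for name, s in scores.items()}
best = max(totals, key=totals.get)
print(totals, best)
```

In practice the +/− patterns matter more than the raw totals: a concept that is worse on one criterion may be hybridized with another rather than eliminated, which is the ideation benefit the paper's models quantify.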
5.
This paper describes a proposal for a multi-material design procedure. First, the context of the study and the requirements of the multi-material must be clearly defined in order to specify the parameters that the designer must select or optimise to produce the design: the components and their volume fraction, the architecture and morphology at different scales, etc. The general design procedure proposed here starts with the reasons why the designer has turned to multi-materials, from which a multi-material concept with fixed parameters can be defined. In this first stage the design problem can be made less complex by reducing the number of unknown parameters and guiding the designer towards the appropriate selection or optimisation tools: (i) subdivision of requirements, guided by applying statistical analysis tools to the materials database to search for appropriate multi-material components, (ii) tools to filter the materials database and search for multi-material components and their volume fraction, (iii) optimisation tools to search for the appropriate architecture when components are known, or to search for architecture and components simultaneously. The paper demonstrates how these tools can be applied to different design concepts.
6.
Although methods of product design and materials selection have been developed, the recent rise of multi-materials (composites, hybrids, etc.) has revealed a lack of engineering tools in this domain. The present contribution aims at the development of a multi-materials design method based on the adaptation of existing methods and on the conclusions of a case study. In comparison with materials selection, the originality of multi-materials design lies in searching for materials-coupling mechanisms to satisfy a set of requirements. Moreover, it appears that the manufacturing process and validation tests are very important points that must be integrated early in the design process.
7.
The concept of combinatorial process, equipment and plant design is introduced and developed for the specific examples of fluid separations and crystallisation. It is shown that traditional methods of process design may miss options that are identified using the combinatorial approach. New options may lead to novel types of processes and equipment. Application of this methodology is suggested in terms of scanning the multi-dimensional space describing the process, equipment and plant attributes. The new approach is particularly appropriate for the design of agile plants for families of products, and where decisions have to be made as to how best to re-configure an existing facility to manufacture a new product.
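The "scanning the multi-dimensional space" idea above amounts to enumerating the Cartesian product of attribute values and then screening by feasibility rules. The attribute lists and the screening constraint below are invented placeholders, not the paper's actual separation or crystallisation attributes.

```python
from itertools import product

# Hypothetical attribute dimensions of the process/equipment/plant space.
separation = ["distillation", "membrane", "extraction"]
mode = ["batch", "continuous"]
plant = ["dedicated", "multipurpose"]

options = list(product(separation, mode, plant))   # full combinatorial space
# Screen with a simple feasibility rule (an invented constraint):
feasible = [o for o in options if not (o[0] == "membrane" and o[1] == "batch")]
print(len(options), len(feasible))
```

The value of the approach is that the enumeration is exhaustive before screening, so options a traditional sequential design method would never consider are at least examined and rejected explicitly.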
8.
Users' affective (Kansei) needs are attracting growing attention from product designers and corporate decision makers. To help product developers satisfy users' multi-dimensional, composite affective needs when developing new products, this paper proposes a method for evaluating product design schemes that combines entropy weighting, grey relational analysis, and the technique for order preference by similarity to an ideal solution (TOPSIS). First, K-means clustering and factor analysis are used to obtain representative samples and typical affective images consistent with users' perceptual cognition, and the semantic differential method then yields initial user-image evaluation values. Second, information entropy is applied to compute the weights of the evaluation criteria. Finally, grey relational analysis combined with TOPSIS is used to rank and select among the candidate design schemes under the users' multi-dimensional image requirements. A case study on selecting among 3D printer design schemes verifies the feasibility of the method. The results show that the method reflects users' actual, complex cognition and can evaluate and select product design schemes effectively and objectively, offering practical value and guidance.
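The entropy-weighting and TOPSIS stages of the evaluation chain described above can be sketched as follows. The score matrix is invented, all criteria are treated as benefit-type, and the grey relational analysis step is omitted for brevity, so this is only a partial illustration of the paper's method.

```python
import numpy as np

# Rows = candidate design schemes, columns = affective-image criteria
# (hypothetical semantic-differential scores).
X = np.array([
    [7.0, 5.0, 6.0],
    [6.0, 7.0, 5.0],
    [8.0, 6.0, 7.0],
])

# Entropy weights: criteria whose scores vary more across schemes get
# more weight.
P = X / X.sum(axis=0)
E = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
w = (1 - E) / (1 - E).sum()

# TOPSIS on the weighted, vector-normalised matrix: rank by relative
# closeness to the ideal solution.
V = w * X / np.linalg.norm(X, axis=0)
d_pos = np.linalg.norm(V - V.max(axis=0), axis=1)
d_neg = np.linalg.norm(V - V.min(axis=0), axis=1)
closeness = d_neg / (d_pos + d_neg)
print(closeness.round(3))
```

The scheme with the highest closeness coefficient is preferred; in the full method this ranking would be fused with grey relational grades before the final selection.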
9.
The generation, evaluation and selection of development schemes for shipborne weapon systems is an important part of weapon-equipment development demonstration. Building on an analysis of current demonstration methods for shipborne weapon system development schemes, this paper constructs a multi-objective optimization model for shipborne weapon systems, proposes a multi-objective method for generating development schemes, and establishes a multi-objective model for evaluating and selecting such schemes under incomplete preference information. A case study shows that the model and method are effective and can provide decision support and analytical assistance for weapon-equipment development demonstration.
10.
An integrated computer-aided system for generation and evaluation of sustainable process alternatives
Niels Jensen Nuria Coll Rafiqul Gani 《Clean Technologies and Environmental Policy》2003,5(3-4):209-225
This paper presents an integrated system for generation of sustainable process alternatives with respect to new process design as well as retrofit design. The generated process alternatives are evaluated through sustainability metrics, environmental impact factors as well as inherent safety factors. The process alternatives for new process design as well as retrofit design are generated through a systematic method that is simple yet effective and is based on a recently developed path flow analysis approach. According to this approach, a set of indicators is calculated in order to pinpoint unnecessary energy and material waste costs and to identify potential design (retrofit) targets that may improve the process design (in terms of operation and cost) simultaneously with the sustainability metrics, environmental impact factors and the inherent safety factors. Only steady-state design data and a database with properties of compounds, including environmental impact factor-related data and safety factor-related data, are needed. The integrated computer-aided system generates the necessary data if actual plant or experimental data are not available. The application of the integrated system is highlighted through a number of examples including the well-known HDA process.
11.
The option generation and selection (OGS) methodology forms part of a general approach for the design of agile chemical plants based on business, product and process knowledge, with support from information models. This paper describes an equipment OGS tool that encompasses the principles of combinatorial process and plant design. The main components of the methodology are: an equipment option generation model, described as a set of objects and the net relationships between them, and an equipment option selection model, which consists of procedures for equipment selection. The two models are supported by databases containing information specific to each equipment type, the concept on which the equipment is based, and relationships with other equipment types. Robust, systematic and complete forms of these models can be used as the basis of an expert system for process equipment design, with equipment selected using these tools satisfying the requirements of both specific processes and families of processes (which share common features, similar functional groups or similar raw material requirements for process operations). Application of the methodology also allows the evaluation of options for reconfiguring an existing plant.
12.
Professor Chuck Eastman D. Stott Parker Tay-Sheng Jeng 《Research in Engineering Design》1997,9(3):125-145
The purpose of this work is to develop automatic methods of semantic integrity maintenance, in support of concurrent engineering. Semantic integrity relations in any final engineering design are built up incrementally, through the use of different computer applications. Here, the structure of these integrity relations is formalised for representation within a database. When changes to a design have to be made, they can invalidate integrity relations in other parts of the design. Formal methods are defined for identifying what data and integrity relations are invalidated by any change. Methods for making changes that minimise re-design are described and formalised. Opportunities for using semantic integrity to assess progress on a design are reviewed. (Research supported by NSF grant IRI-9319982.)
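The "identify what a change invalidates" step described above is, at its core, reachability over a dependency graph. The data items and the graph below are invented, and the paper's database formalism is far richer than this sketch.

```python
# Each edge A -> B means "B's integrity depends on A", so a change to A
# invalidates B and everything downstream of B. All names are hypothetical.
depends_on_me = {
    "beam_length": ["joint_detail", "load_calc"],
    "load_calc": ["bolt_sizing"],
    "joint_detail": [],
    "bolt_sizing": [],
}

def invalidated_by(changed_item):
    """Return every data item whose integrity a change transitively invalidates."""
    seen, stack = set(), [changed_item]
    while stack:
        item = stack.pop()
        for dep in depends_on_me.get(item, []):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

print(invalidated_by("beam_length"))
```

Minimising re-design then becomes a matter of choosing the change whose invalidated set is smallest, which is one way to read the paper's "methods for making changes that minimise re-design".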
13.
Philippe d’Anjou 《Design Studies》2011,32(1):45-59
14.
An improved decomposition scheme for assessing the reliability of embedded systems by using dynamic fault trees
The theory of fault trees has been used for many years because fault trees provide a concise representation of the failure behavior of general non-repairable fault-tolerant systems. The defect of traditional fault trees, however, is a lack of accuracy when modeling the dynamic failure behavior of systems with a fault-recovery process. A solution to this problem is behavioral decomposition: a system is divided into several dynamic or static modules, and each module can be further analyzed separately using binary decision diagrams (BDDs) or Markov chains. In this paper, we present a decomposition scheme in which independent subtrees of a dynamic module are detected and solved hierarchically. Experimental results show that the proposed method yields significant savings in computation time without unacceptable loss of accuracy. We also present an analysis software toolkit, DyFA (dynamic fault-trees analyzer), which implements the proposed methodology.
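The static-module side of the decomposition can be illustrated with plain gate arithmetic. The tree shape and event probabilities below are invented; dynamic modules (fault recovery, spare gates) require Markov chains and are deliberately outside this toy.

```python
# Static fault-tree gates for independent basic events (probabilities are
# hypothetical). AND = all inputs fail; OR = at least one input fails.
def p_and(*ps):
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Top event = OR(AND(cpu_a, cpu_b), bus): redundant CPUs behind a shared bus.
top = p_or(p_and(0.01, 0.01), 0.001)
print(top)
```

In a real BDD-based solver the same probabilities fall out of a Shannon expansion over the basic events; the gate formulas above are the independent-event special case.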
15.
Improving and supporting the process of design knowledge reuse can increase productivity, improve the quality of designs and
lead to corporate competitive advantage. Whereas internal knowledge reuse (reusing knowledge from one’s personal memory or
experiences) is very effective, external knowledge reuse (reusing knowledge from an external digital or paper archive) often
fails. This paper studies the value of the storytelling paradigm in supporting reuse from an external repository. Based on a formalisation of the internal reuse process from ethnographic
studies, a prototype system, Corporate Memory (CoMem), is presented, which supports the reuse process, specifically the steps
of finding and understanding reusable items. This paper focuses on the ability of designers to understand designs that are found in corporate repositories. It is argued that in order to understand and reuse a found design, the
designer needs to see the evolution of that design during the original design process. An Evolution History Explorer module of the CoMem system is presented that uses a storytelling metaphor and lays out versions visually side-by-side. A
formal user evaluation of CoMem supports the hypotheses that (1) exploring the evolution of a design improves the reuse process,
and (2) that visual storytelling is an effective paradigm for supporting that exploration.
16.
This paper evaluates judgments of driver crash responsibility to estimate alcohol and drug impairment effects when exposure data are unavailable to calculate crash risks. Previous studies using responsibility judgments provided some evidence that responsibility is related to BAC. Other studies, some inferring responsibility, indicated a relation between responsibility and relative crash risk. Data are presented showing that responsibility judgments made with a rating scale have high interrater reliability, and systematic relations with BAC suggest some validity in the ratings. A method is demonstrated for estimating relative crash risk from responsibility judgments with accident data, and the limitations of responsibility analysis are discussed. While alcohol and drug impairment effects are best determined with relative crash risks derived from accident and exposure data, responsibility analysis may provide useful indications in the absence of exposure data.
17.
18.
Execution of a complex product development (PD) project is facilitated through its decomposition into an interrelated set of localized development tasks. When a local task is completed, its output is integrated through an iterative cycle of system-wide integration activities. Integration is often accompanied by inadvertent information hiding due to asynchronous information exchanges. We show that information hiding leads to persistent recurrence of problems (termed the design churn effect) such that progress oscillates between being on schedule and falling behind. The oscillatory nature of the PD process confounds progress measurement and makes it difficult to judge whether the project is on schedule or slipping. We develop a dynamic model of work transformation to derive conditions under which churn is observed as an unintended consequence of information hiding due to local and system task decomposition. We illustrate these conditions with a case example from an automotive development project and discuss strategies to mitigate design churn.
Ali Yassine (Phone: 217-333-8765, Fax: 217-244-6165)
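The work-transformation idea behind this entry can be sketched with a linear rework model: remaining work evolves as w_{k+1} = A·w_k, where off-diagonal entries capture the rework one task creates for another. The matrix and two-task setup are invented; the paper's model and its churn conditions are more elaborate.

```python
import numpy as np

# Hypothetical work-transformation matrix for two coupled tasks: each
# iteration leaves some own residue (diagonal) and spills rework onto
# the other task (off-diagonal).
A = np.array([
    [0.2, 0.5],
    [0.5, 0.2],
])
w = np.array([1.0, 1.0])   # initial remaining work per task

history = []
for _ in range(10):
    history.append(w.sum())
    w = A @ w

# Spectral radius < 1 means churn decays and the project converges;
# >= 1 means rework regenerates faster than it is completed.
rho = max(abs(np.linalg.eigvals(A)))
print(round(rho, 3), round(history[-1], 4))
```

Asynchronous (delayed) information exchange effectively changes A between iterations, which is how the model produces the on-schedule/behind-schedule oscillation the abstract describes.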
19.
20.
The Exponentially Weighted Moving Average (EWMA) schemes are a potent tool for monitoring small to moderate variations in quality characteristics in the production lines of manufacturing industries. Practitioners in various sectors widely use EWMA schemes for monitoring both variables and attributes. In the present article, we investigate a modified EWMA scheme based on the power of the difference between the actual number of nonconforming items and its technical specification in an in-control (IC) situation. We abbreviate it as the wEWMA scheme and show that the traditional EWMA scheme is a particular case of the proposed scheme when the power is unity. We establish that powers lower than unity are more effective for detecting smaller shifts, while for detecting substantial variations in the process parameter one should prefer powers greater than unity. Noting that the possible magnitude of a shift is often unknown, we propose an optimal design procedure for the scheme, including the determination of its charting parameters to ensure the best overall performance. The results reveal that optimal wEWMA schemes can be beneficial in detecting a shift very quickly when the sample size is small, particularly for high-precision production processes.
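A power-type EWMA statistic of the kind this entry describes can be sketched as follows. The update rule (smoothing sign(d)·|d|^power where d is the deviation of the nonconforming count from its in-control specification), the parameter values, and the data are illustrative assumptions, not the paper's exact wEWMA design or its control limits.

```python
# Sketch of a power-transformed EWMA for nonconforming counts;
# power = 1.0 recovers the ordinary EWMA recursion.
def w_ewma(counts, target, lam=0.2, power=0.5):
    z, path = 0.0, []
    for x in counts:
        d = x - target
        t = (abs(d) ** power) * (1 if d >= 0 else -1)
        z = lam * t + (1 - lam) * z
        path.append(z)
    return path

counts = [3, 2, 4, 3, 9, 10, 9]   # upward shift in nonconforming items at t=5
path = w_ewma(counts, target=3)
print([round(z, 3) for z in path])
```

With power below unity the transform compresses large deviations and amplifies small ones relative to the plain difference, which is consistent with the paper's finding that low powers favor small-shift detection.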