20 similar documents found; search time: 390 ms
1.
Andrea Saltelli, Marco Ratto, Stefano Tarantola, Francesca Campolongo (European Commission, Joint Research Centre, Ispra). 《Reliability Engineering & System Safety》2006,91(10-11):1109-1125
Fourteen years after Science's 1989 review of sensitivity analysis (SA) methods ("System analysis at molecular scale", by H. Rabitz), we searched Science Online to identify and review all recent articles having "sensitivity analysis" as a keyword. In spite of the considerable developments that have taken place in this discipline, of the good practices that have emerged, and of existing guidelines for SA issued on both sides of the Atlantic, our review found little beyond very primitive SA tools based on "one-factor-at-a-time" (OAT) approaches. In the context of model corroboration or falsification, we demonstrate that this use of OAT methods is illicit and unjustified unless the model under analysis is proved to be linear. We show that available good practices, such as variance-based measures, overcome the shortcomings of OAT and are easy to implement. These methods also allow the concept of factor importance to be defined rigorously, making the factor-importance ranking unambiguous. We analyse the requirements of SA in the context of modelling, present best available practices on the basis of an elementary model, and point the reader to available recipes for a rigorous SA.
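Reading note: the variance-based measures advocated in this abstract can be tried in a few lines. Below is a minimal sketch (not from the paper) of the Sobol' pick-freeze estimator on a hypothetical non-additive model, where an OAT sweep around a nominal point sees only the unit slopes and misses the dominant interaction:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # hypothetical non-additive test model: y = x1 + x2 + 5*x1*x2
    return x[:, 0] + x[:, 1] + 5.0 * x[:, 0] * x[:, 1]

n = 200_000
A = rng.uniform(-1.0, 1.0, size=(n, 2))   # two independent sample matrices
B = rng.uniform(-1.0, 1.0, size=(n, 2))
var_y = model(A).var()

def first_order(i):
    # pick-freeze estimator of the first-order index S_i = V_i / V(y)
    ABi = B.copy()
    ABi[:, i] = A[:, i]                    # freeze factor i at matrix A's values
    return float(np.mean(model(A) * (model(ABi) - model(B))) / var_y)

S1, S2 = first_order(0), first_order(1)
# Analytically S1 = S2 = 3/31 (about 0.097), so S1 + S2 is only about 0.19:
# most of the output variance comes from the interaction term, which no
# OAT design varying one factor at a time can reveal.
```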
2.
Dir. u. Prof. Dr.-Ing. Wolfgang Gerisch, Dir. u. Prof. Dr.-Ing. Werner Struck, Dir. u. Prof. Dr.-Ing. Beate Wilke 《Forschung im Ingenieurwesen》1992,58(4):77-82
Assuming the statistical model of the random-effects one-way layout under the usual conditions (normality and independence), and considering a realization of this model, the present paper treats the computation of one-sided tolerance limits for the corresponding defining random variable on the basis of the Monte Carlo method. Since the subpopulations occurring in this scope can be considered as batches, and since the model under consideration is sometimes called a 'cluster sampling model', the present problem has important applications in the field of statistical quality control. The Monte Carlo approach, together with its realization by a computer program for the determination of such tolerance limit factors, proved elementary and effective. In particular, in the important cases of limited sampling information (costs), this approach led to more realistic values in comparison with other approaches. An additional problem arising in connection with simultaneous statistical inference was solved by means of the Bonferroni inequality. Finally, the paper contains a computer program (FORTRAN 77) whose applicability and use are shown by numerical examples.
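For the simpler i.i.d. normal case (the paper treats the richer random-effects one-way layout), the Monte Carlo determination of a one-sided tolerance limit factor can be sketched as follows; the sample size and coverage/confidence levels below are illustrative choices, not the paper's:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(1)

def mc_tolerance_factor(n, p, gamma, n_rep=200_000):
    """Monte Carlo estimate of the factor k such that the one-sided limit
    xbar + k*s covers at least a proportion p of a normal population
    with confidence gamma (i.i.d. sampling, not the one-way layout)."""
    z_p = NormalDist().inv_cdf(p)
    x = rng.standard_normal((n_rep, n))
    xbar, s = x.mean(axis=1), x.std(axis=1, ddof=1)
    # the limit covers proportion p iff xbar + k*s >= z_p,
    # i.e. iff k >= (z_p - xbar) / s; take the gamma-quantile over replicates
    return float(np.quantile((z_p - xbar) / s, gamma))

k = mc_tolerance_factor(n=10, p=0.95, gamma=0.95)
# the exact (noncentral-t) value for these settings is about 2.91
```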
3.
Function-combination products are an important class of innovative products. To improve the feasibility and creativity of function-combination product design, a design method based on multiple biological effects is proposed. First, the combinations of factors and the transformation mechanisms by which biological prototypes realize their functions are studied, and multiple biological effects are extracted to build a multi-biological-effect knowledge base. Second, basic modes of function combination are derived by analyzing multi-biological-effect models, and a product function-combination design process model based on multiple biological effects is established by integrating the TRIZ method, with biological function-synergy modes guiding the function-combination design. Finally, the method is applied to the design of a freshwater-collecting tent for hot and humid regions. The results show that this function-combination design method satisfies users' multiple product requirements well and is effective and feasible.
4.
Wojciech Tarnowski 《Design Studies》1979,1(1):45-48
A universal model of the structure of elementary design-task solving (the horizontal structure) is presented, emphasizing the activities concerning choice: appraisal, selection, optimization and decision. Requirements for the value system are defined, and a transformation sequence for the choice criterion is proposed, leading from the primary criterion (not strictly defined) to the task criterion (accurately defined and measurable). A method is given for determining this criterion for random, fuzzy or deterministic requirements imposed on a design, design variables and performance variables, using a utility function and weighting factors.
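The weighted-utility construction of a task criterion can be illustrated with a short sketch; the utility functions, weights and performance values below are hypothetical, and only the deterministic case is shown (the paper also covers random and fuzzy requirements):

```python
import numpy as np

def task_criterion(performance, utilities, weights):
    """Scalar choice criterion: weighted sum of per-requirement utilities.
    Each utility maps a raw performance value onto [0, 1]; the weights
    are normalized so the criterion itself also lies in [0, 1]."""
    u = np.array([f(x) for f, x in zip(utilities, performance)])
    w = np.asarray(weights, dtype=float)
    return float(u @ (w / w.sum()))

# hypothetical design with two performance variables: mass and cost
utilities = [lambda mass: max(0.0, 1.0 - mass / 10.0),   # lighter is better
             lambda cost: max(0.0, 1.0 - cost / 100.0)]  # cheaper is better
score = task_criterion([5.0, 50.0], utilities, weights=[0.7, 0.3])
```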
5.
《Technometrics》2012,54(4):545-559
Abstract: We present a new method, called analysis-of-marginal-tail-means (ATM), for effective robust optimization of discrete black-box problems. ATM has important applications in many real-world engineering problems (e.g., manufacturing optimization, product design, and molecular engineering) where the objective to optimize is black-box and expensive and the design space is inherently discrete. One weakness of existing methods is that they are not robust: they perform well under certain assumptions but yield poor results when those assumptions (which are difficult to verify in black-box problems) are violated. ATM addresses this by combining rank- and model-based optimization via the use of marginal tail means. The trade-off between rank- and model-based optimization is tuned by first identifying important main effects and interactions from data, then finding a good compromise that best exploits additive structure. ATM provides improved robust optimization over existing methods, particularly in problems with (i) a large number of factors, (ii) unordered factors, or (iii) experimental noise. We demonstrate the effectiveness of ATM in simulations and in two real-world engineering problems: the first on robust parameter design of a circular piston, and the second on product family design of a thermistor network.
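As a reading aid, the marginal tail mean statistic at the core of ATM can be computed as below. This is a deliberately simplified reading (the mean of the best alpha-fraction of responses at each factor level, under a minimization convention) and omits the rank/model trade-off that the full method tunes:

```python
import numpy as np

def marginal_tail_means(X, y, alpha=0.25):
    """For each factor j and level l, average the smallest alpha-fraction
    of responses observed with X[:, j] == l.  Simplified illustration only;
    full ATM additionally blends these with model-based terms."""
    mtm_per_factor = []
    for j in range(X.shape[1]):
        mtm = {}
        for level in np.unique(X[:, j]):
            yl = np.sort(y[X[:, j] == level])          # responses at this level
            k = max(1, int(np.ceil(alpha * len(yl))))  # size of the lower tail
            mtm[int(level)] = float(yl[:k].mean())
        mtm_per_factor.append(mtm)
    return mtm_per_factor

# toy 2^2 design with response y = 2*x0 + x1 (smaller is better)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0.0, 1.0, 2.0, 3.0])
mtm = marginal_tail_means(X, y)
# factor 0 separates the tails far more than factor 1, matching its
# larger coefficient in the toy response
```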
6.
The geometric parameters of composite laminated plates in engineering structures are often random in nature. Studying the sensitivity of laminates with random parameters, and optimizing those parameters, is of great significance for correctly estimating the reliability of a structural design. Based on first-order shear deformation theory for laminates, a spline finite element method is used to derive the vibration equation, stiffness matrix, mass matrix and proportional damping matrix of the laminate, together with formulas for the response sensitivity of antisymmetric laminates. On the basis of this sensitivity analysis, the fundamental frequency of the composite laminate is analyzed and optimized, and the optimal ply angle is computed with a grid method. Numerical examples verify the effectiveness of the algorithm.
7.
8.
After purification and organic modification of attapulgite (AT), nano-attapulgite/poly(lactic acid) composites (OAT/PLA-x) with OAT mass fractions of 1%, 3% and 5% were prepared by in-situ polymerization. The composites were characterized by infrared spectroscopy, scanning electron microscopy and X-ray diffraction; the SEM results show that the attapulgite particles are uniformly and stably dispersed in the composites. Mechanical and thermal testing shows that the tensile strength and elastic modulus of the OAT/PLA-3 composite increase by 98.6% and 130.0%, respectively, over pure PLA, and that the thermal stability of the composites improves markedly. The solution degradation rate of the composites is also clearly accelerated.
9.
Conclusions. The examples dealt with above cover the most typical kinds of functions n(t) in various types of instruments. If the function n(t) differs from the above, it becomes necessary to use the general formulas (4)–(6). The integrals used in applying these formulas are not always expressible in elementary functions, but they can always be calculated by approximate methods. The most important problem in the theory of instruments based on nuclear radiations consists in providing design methods that would enable one to find the optimum values of all the parameters [3]. This problem is very complicated, and at present solutions exist only for some of the simplest instances. In developing engineering design methods it is necessary, in the first instance, to take account of the fluctuating nature of the signal.
10.
Bradley Jones 《Quality Engineering》2016,28(1):98-106
ABSTRACT: The primary aim of screening experiments is to identify the active factors, that is, those having the largest effects on the response of interest. Large factor effects can be main effects, two-factor interactions (2FIs), or even strong curvature effects. Because the number of runs in a screening experiment is generally on the order of the number of factors, the designs rely heavily on the factor or effect sparsity assumption; that is, practitioners performing such experiments must be willing to assume that only a small fraction of the factors or effects are active. Traditional screening designs, such as regular fractional factorial and Plackett-Burman designs, employ factors at two levels only. Though they have orthogonal linear main effects, such designs cannot uniquely identify factors with strong curvature effects. Definitive screening designs (DSDs) have many desirable properties that make them appealing alternatives to other screening design methods. They are orthogonal for the main effects; main effects are orthogonal to all second-order effects; second-order effects are not confounded with each other; and the quadratic effect of every factor is estimable. For more than five factors, a DSD projects onto any three factors so that a full quadratic model in those three factors is estimable with reasonable efficiency. As a result, when three or fewer factors turn out to be important, follow-up optimization experiments may not be necessary. All this begs the question, "Are DSDs really as good as they are advertised to be?" This article addresses that question with an even-handed comparison of the various screening approaches. It also considers the sparsity assumption common to all screening designs and provides guidance for quantifying what effect sparsity means for both traditional screening designs and DSDs.
11.
David J. Edwards, David H. Q. Truong 《Quality and Reliability Engineering International》2011,27(8):1009-1024
The sequential design approach to response surface exploration is often viewed as advantageous because it provides the opportunity to learn from each successive experiment, with the ultimate goal of determining optimum operating conditions for the system or process under study. Recent literature has explored factor screening and response surface optimization using a single three-level design, to handle situations where conducting multiple experiments is prohibitive. The most straightforward and accessible analysis strategy for such designs is first to perform a main-effects-only analysis to screen important factors, before projecting the design onto those factors to conduct response surface exploration. This article proposes the use of optimal designs with minimal aliasing (MA designs) and demonstrates that they are more effective at screening important factors than the existing designs recommended for single-design response surface exploration. For comparison purposes, we construct 27-run MA designs with up to 13 factors and demonstrate their utility using established design criteria and a simulation study. Copyright 2011 © John Wiley & Sons, Ltd.
12.
First excursion probabilities for linear systems by very efficient importance sampling
An analytical study of the failure region of the first excursion reliability problem for linear dynamical systems subjected to Gaussian white noise excitation is carried out with a view to constructing a suitable importance sampling density for computing the first excursion failure probability. Central to the study are 'elementary failure regions', which are defined as the failure region in the load space corresponding to the failure of a particular output response at a particular instant. Each elementary failure region is completely characterized by its design point, which can be computed readily using impulse response functions of the system. It is noted that the complexity of the first excursion problem stems from the structure of the union of the elementary failure regions. One important consequence of this union structure is that, in addition to the global design point, a large number of neighboring design points are important in accounting for the failure probability. Using information from the analytical study, an importance sampling density is proposed. Numerical examples are presented, which demonstrate that the efficiency of using the proposed importance sampling density to calculate system reliability is remarkable.
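The design-point idea is easy to demonstrate on a toy static analogue: a single linear limit state in standard normal space, rather than the dynamical union of elementary failure regions studied in the paper. All numbers below are illustrative:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(2)

# failure region {z : a.z > b} with z ~ N(0, I); since ||a|| = 1 here,
# the design point (closest failure point to the origin) is z* = b * a
a = np.array([0.6, 0.8])
b = 3.0
z_star = b * a / a.dot(a)

n = 50_000
z = z_star + rng.standard_normal((n, 2))        # IS density: N(z*, I)
# density ratio N(0, I) / N(z*, I) evaluated at the samples
log_w = -z @ z_star + 0.5 * z_star.dot(z_star)
p_is = float(np.mean((z @ a > b) * np.exp(log_w)))

p_exact = NormalDist().cdf(-b / np.linalg.norm(a))   # about 1.35e-3
```

Centering the sampling density on the design point puts roughly half the samples in the failure region, so a rare probability is estimated accurately with far fewer samples than crude Monte Carlo would need.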
13.
Leonard A. Perry, Douglas C. Montgomery, John W. Fowler 《Quality and Reliability Engineering International》2001,17(6):429-438
The output quality or performance characteristics of a product often depend not only on the factors in the current process but also on factors from preceding processes. Statistically designed experiments provide a systematic approach to studying the effects of multiple factors on process performance by offering a structured set of analyses of data collected through a design matrix. One important limitation of experimental design methods is that they have seldom been applied to multiple sequential processes. The objective here is to create a first-order experimental design for multiple sequential processes that involve several factors and multiple responses. The first-order design extends current experimental designs to incorporate two processes into one partitioned design. The designs are evaluated on the complexity of their alias structure and their orthogonality characteristics. The advantages include a decrease in the number of experimental runs, a reduction in experiment execution time, and a better understanding of the overall process variables and their influence on each response. Copyright © 2001 John Wiley & Sons, Ltd.
14.
Optoacoustic tomography (OAT) is an emerging biomedical imaging technique that plays an important role in basic medical research and clinical practice. To address the limited spatial resolution of existing OAT, a new high-resolution photoacoustic reconstruction network (Physical Attention U-Net, Phys-AU-Net) is proposed that combines a physical point spread function (PSF) model with a convolutional neural network (CNN). The method adopts an unsupervised learning strategy: the physical PSF model simulates the diffraction-limited imaging mechanism, while an attention-based U-Net extracts features from images of densely overlapping absorbers. Acting together, they allow Phys-AU-Net to overcome the limit that acoustic diffraction places on OAT spatial resolution. Experimental results show that Phys-AU-Net effectively achieves high-resolution reconstruction of diffraction-limited optoacoustic tomograms, improving considerably on the U-Net: structural similarity (SSIM) improves by 43.5%, and peak signal-to-noise ratio (Peak Sign…
15.
16.
Design of experimental bench and internal pressure measurement of scroll compressor with refrigerant injection
Baolong Wang, Xianting Li, Wenxing Shi, Qisen Yan 《International Journal of Refrigeration》2007,30(1):179-186
Experiments on the internal compression process of a scroll compressor with refrigerant injection can reveal the essence of refrigerant injection. The difficulties of such experiments lie in locating the measuring ports, measuring the dynamic pressure, and designing the injection system. Focusing on dynamic pressure measurement of the internal compression process during refrigerant injection, an integrated bench design method for refrigerant-injection research in scroll compressors is presented in this paper. The location design of the injection and measuring ports, frequency-spectrum analysis of the pressure signal, selection of the sensor type and configuration, and design of the pressure-leading system are described in turn. Finally, a test bench was set up and several elementary experiments were carried out. The results show that this design method solves most problems in the experimental study of scroll compressors with refrigerant injection and works reliably; that refrigerant injection affects the majority of the internal compression process and should not be treated as a transient process; and that gas injection can increase system performance greatly, with an optimal injection pressure existing for a given scroll compressor.
17.
Objective: To improve enterprise service quality and customer satisfaction, a service design method based on a QFD-IPA model is proposed. Methods: Starting from an analysis of the service design system and focusing on how to design services effectively, quality function deployment (QFD) is used to determine the weights of the service quality elements, and an importance-performance analysis (IPA) model is then applied to the key service quality elements that affect customer satisfaction. Results: The analysis of the strengths and weaknesses of each service quality element provides an important basis for an enterprise's self-assessment and for improving its service design. The method is applied to the service design of the 12306 mobile ticket-booking system, demonstrating its feasibility and effectiveness. Conclusion: The work opens a new avenue for both theoretical research and practical application of service design.
18.
Design evaluation of children's smart watches based on the analytic hierarchy process
Objective: To solve the problem of evaluating product design proposals at a late stage. Methods: A design evaluation method based on the analytic hierarchy process (AHP) is proposed. The product is first decomposed into a set of indicators to be evaluated and scored; the data are then collated and the weight of each indicator computed; finally, a composite score for each proposal is obtained and the proposals are ranked. Results: The accuracy of the ranking was verified against market sales data for three children's smart watches, providing a reference for the effective evaluation of product design proposals. Conclusion: Introducing AHP into design evaluation effectively mitigates problems such as the large number of elements to evaluate, the heavy reliance on evaluators' experience, and the difficulty of judging how each factor influences the final proposal.
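The weight-computation step of the AHP can be sketched as follows; the pairwise comparison matrix is a hypothetical three-criterion example, not data from the study:

```python
import numpy as np

def ahp_weights(A):
    """Principal-eigenvector weights of a pairwise comparison matrix A,
    with the consistency index CI = (lambda_max - n) / (n - 1)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    vals, vecs = np.linalg.eig(A)
    k = int(np.argmax(vals.real))          # Perron (dominant) eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                           # normalize weights to sum to 1
    ci = (vals[k].real - n) / (n - 1)
    return w, float(ci)

# hypothetical comparisons: safety vs. battery life vs. price
A = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
w, ci = ahp_weights(A)   # CI near 0 indicates consistent judgments
```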
19.
Chia-Wei Liu, Yew-Khoy Chuah 《International Journal of Refrigeration》2011,34(3):816-823
An optimal approach temperature (OAT) control strategy is proposed for resetting the condensing water temperature hourly, so as to maximize the performance of the combined water chiller and cooling tower system. A system performance factor is introduced to evaluate system performance, and a regression function is presented for calculating the optimal condensing water temperature at each hour of air-conditioning operation. The parameters of the regression function include the ambient wet-bulb temperature, the chiller load ratio at the hour, and a dimensionless relative efficiency of chiller and cooling tower. The regression function has an R2 close to 1 against the computed results. When applied to two cities in Taiwan, the OAT control strategy showed the potential to save more than 4% of energy on an annual basis. The strategy is most advantageous in regions with large seasonal variation of wet-bulb temperature.
20.
In body-centered cubic (bcc) metals, an unambiguous determination of the elementary slip planes remains difficult owing to the several possible interpretations of the glide activity, of slip steps on the specimen surface, or of features of the dislocation microstructure. In this article, a method is proposed to determine the elementary slip planes in bcc metals from the line directions of sessile junctions resulting from the interaction of mobile dislocations with \({a}/2\langle 111\rangle \) Burgers vectors. The proposed method determines slip activity inside the material rather than at its surface, where other effects may play a role. It is in principle applicable to determining the elementary slip plane in any crystalline material; in particular, it may help to resolve the long-standing debate over the nature of the elementary slip planes in bcc metals.