Similar Literature
A total of 20 similar documents were found.
1.
We aim to analyze the effects of component level reliability data, including both catastrophic failures and margin failures, on system level reliability. While much work has been done to analyze margins and uncertainties at the component level, a gap exists in relating this component level analysis to the system level. We apply methodologies for aggregating uncertainty from component level data to quantify overall system uncertainty. We explore three approaches toward this goal: the classical Method of Moments (MOM), Bayesian methods, and bootstrap methods. These three approaches are used to quantify the uncertainty in reliability for a system of mixed series and parallel components for which both pass/fail and continuous margin data are available. This paper provides proof of concept that uncertainty quantification methods can be constructed and applied to system reliability problems. In addition, application of these methods demonstrates that the results from the three fundamentally different approaches can be quite comparable.
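As a hedged illustration of how such component-to-system aggregation can be carried out, here is a minimal Python sketch of the bootstrap approach for a hypothetical two-component series system with pass/fail data only; the data, sample sizes, and system structure are invented for illustration and do not reproduce the paper's actual systems or margin data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pass/fail data for two components in series (1 = pass, 0 = fail).
comp_a = np.array([1] * 48 + [0] * 2)   # 48/50 passes
comp_b = np.array([1] * 29 + [0] * 1)   # 29/30 passes

def system_reliability(a, b):
    # Series system: both components must work.
    return a.mean() * b.mean()

# Nonparametric bootstrap: resample each component's data with replacement
# and recompute the system reliability to build an uncertainty distribution.
boot = np.array([
    system_reliability(rng.choice(comp_a, size=comp_a.size, replace=True),
                       rng.choice(comp_b, size=comp_b.size, replace=True))
    for _ in range(10_000)
])

print("point estimate:", system_reliability(comp_a, comp_b))
print("95% bootstrap interval:", np.percentile(boot, [2.5, 97.5]))
```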

2.
Design methods have been studied by researchers for decades, yet academia considers their impact on industry to be insufficient. The objective of this research is to understand the use and impact of design methods in the context of a specific company, Volvo Car Corporation (VCC), by describing the behaviour of engineers in relation to methods, in order to assist the future development of design methods and tools. We concentrate mainly on concept selection methods because of their relevance in this company. The data presented are the result of qualitative research carried out over four years at VCC, where the authors were located as researchers. The research shows that many methods are employed besides those with an academic name, that some in-company methods contain improvements over methods researched by academia, that some modifications to academic methods lead to unreliable results, and that there is a lack of objectivity in method modification. For these reasons, the authors suggest further research on understanding the principles of successful and unreliable modification of concept selection methods.

3.
Clean Technologies and Environmental Policy - Nowadays, a large part of daily activities is associated with the use of fossil fuels. Therefore, the excessive exploitation and related pollution have...

4.
In the present scenario, industries are implementing sustainability concepts because of consumers' awareness of eco-friendly products and services, and because of the need to arrive at concepts that are not only economically beneficial but also sustainable. When sustainability is implemented, it is mostly discussed from perspectives such as the environmental, economic, and social dimensions, or in a material, product, and process orientation in manufacturing units. Since sustainable development is a multi-dimensional concept, a Multi-Criteria Decision Making (MCDM) method can be used to find the best alternative among the given alternatives. In this context, one MCDM method, ELECTRE II (ELimination Et Choix Traduisant la REalite), is used to find the best of the three orientations: material, product, and process.
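As a hedged sketch of the kind of computation underlying ELECTRE-style outranking, the snippet below computes only the concordance matrix for three hypothetical orientation alternatives; the scores and weights are invented, and the full ELECTRE II procedure (discordance indices and strong/weak outranking relations) is not shown.

```python
import numpy as np

# Hypothetical scores of the three orientations on four sustainability criteria
# (rows: material, product, process; higher is better) and criterion weights.
alternatives = ["material", "product", "process"]
scores = np.array([
    [7.0, 5.0, 6.0, 8.0],
    [6.0, 8.0, 7.0, 6.0],
    [8.0, 6.0, 5.0, 7.0],
])
weights = np.array([0.3, 0.3, 0.2, 0.2])

# Concordance index c(a, b): total weight of the criteria on which alternative a
# is at least as good as alternative b (the first step of ELECTRE-style outranking).
n = len(alternatives)
concordance = np.zeros((n, n))
for a in range(n):
    for b in range(n):
        if a != b:
            concordance[a, b] = weights[scores[a] >= scores[b]].sum()

print(concordance)
```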

5.
EPA's Great Lakes National Program Office (GLNPO) is leading one of the most extensive studies of a lake ecosystem ever undertaken. The Lake Michigan Mass Balance Study (LMMB Study) is a coordinated effort among state, federal, and academic scientists to monitor tributary and atmospheric pollutant loads, develop source inventories of toxic substances, and evaluate the fate and effects of these pollutants in Lake Michigan. A key objective of the LMMB Study is to construct a mass balance model for several important contaminants in the environment: PCBs, atrazine, mercury, and trans-nonachlor. The mathematical mass balance models will provide a state-of-the-art tool for evaluating management scenarios and options for control of toxics in Lake Michigan. At the outset of the LMMB Study, managers recognized that the data gathered and the model developed from the study would be used extensively by data users responsible for making environmental, economic, and policy decisions. Environmental measurements are never true values and always contain some level of uncertainty. Decision makers, therefore, must recognize and be sufficiently comfortable with the uncertainty associated with data on which their decisions are based. The quality of data gathered in the LMMB was defined, controlled, and assessed through a variety of quality assurance (QA) activities, including QA program planning, development of QA project plans, implementation of a QA workgroup, training, data verification, and implementation of a standardized data reporting format. As part of this QA program, GLNPO has been developing quantitative assessments that define data quality at the data set level. GLNPO also is developing approaches to derive estimated concentration ranges (interval estimates) for specific field sample results (single study results) based on uncertainty. The interval estimates must be used with consideration to their derivation and the types of variability that are and are not included in the interval.

6.
The concept of uncertainty in measurements and quantitative characteristics based on that concept are relatively new in the history of measurements. The differences between this new concept and that of measurement errors are examined. Translated from Izmeritel'naya Tekhnika, No. 5, pp. 24–26, May, 2000.

7.
Many viruses evolve rapidly. For example, haemagglutinin (HA) of the H3N2 influenza A virus evolves to escape antibody binding. This evolution of the H3N2 virus means that people who have previously been exposed to an influenza strain may be infected by a newly emerged virus. In this paper, we use Shannon entropy and relative entropy to measure the diversity and the selection pressure exerted by antibodies at each amino acid site of H3 HA between the 1992–1993 season and the 2009–2010 season. Shannon entropy and relative entropy are two independent state variables that we use to characterize H3N2 evolution. The entropy method estimates future H3N2 evolution and migration using currently available H3 HA sequences. First, we show that the rate of evolution increases with the virus diversity in the current season: the Shannon entropy of the sequence in the current season predicts the relative entropy between sequences in the current season and those in the next season. Second, a global migration pattern of H3N2 is assembled by comparing the relative entropy flows of sequences sampled in China, Japan, the USA and Europe. We verify this entropy method by describing two aspects of historical H3N2 evolution. First, we identify 54 amino acid sites in HA that have evolved in the past to evade the immune system. Second, the entropy method shows that epitopes A and B on the top of HA evolve most vigorously to escape antibody binding. Our work provides a novel entropy-based method to predict and quantify future H3N2 evolution and to describe the evolutionary history of H3N2.
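A minimal sketch, assuming a hypothetical toy alignment, of how per-site Shannon entropy and relative entropy (Kullback-Leibler divergence) between two seasons could be computed; the sequences, the pseudo-count, and the direction of the divergence are illustrative choices, not the paper's exact procedure.

```python
import numpy as np
from collections import Counter

def site_distribution(column, alphabet="ACDEFGHIKLMNPQRSTVWY", pseudo=1e-6):
    # Frequency of each amino acid at one alignment column, with a small
    # pseudo-count so the relative entropy stays finite.
    counts = Counter(column)
    freqs = np.array([counts.get(a, 0) + pseudo for a in alphabet], dtype=float)
    return freqs / freqs.sum()

def shannon_entropy(p):
    return -np.sum(p * np.log2(p))

def relative_entropy(p, q):
    # Kullback-Leibler divergence D(p || q) between two site distributions.
    return np.sum(p * np.log2(p / q))

# Hypothetical aligned HA fragments from two seasons (three sites shown).
season_1 = ["KDS", "KDS", "KNS", "KDS"]
season_2 = ["KNS", "KNT", "KNS", "KNS"]

for site in range(3):
    p = site_distribution([seq[site] for seq in season_1])
    q = site_distribution([seq[site] for seq in season_2])
    print(f"site {site}: H = {shannon_entropy(p):.3f} bits, "
          f"D(season1 || season2) = {relative_entropy(p, q):.3f} bits")
```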

8.
Providing global connectivity with high speed and guaranteed quality at any place and any time is now becoming a reality thanks to the integration and coordination of different radio access technologies. The internetworking of existing networks with diverse characteristics is considered an attractive way to meet the rapid development of interactive multimedia services and the ever-growing demands of mobile users. Because of the diverse characteristics of heterogeneous networks, several challenges have to be addressed in terms of quality of service (QoS), mobility management and user preferences. To achieve this goal, an optimal network selection algorithm is needed to select the target network that maximizes end-user satisfaction. Existing work does not consider integrating the utility function with mobile terminal mobility characteristics to minimize ping-pong effects in the integrated networks. An integrated multicriteria network selection algorithm based on a multiplicative utility function and residual residence time (RRT) estimation is proposed to keep mobile users always best connected. The multiplicative weighted utility function considers network conditions, application QoS and user preferences to evaluate the available networks. In this paper, the proposed scheme is implemented for two user classes: pedestrian users and high-velocity users. For high-velocity users, RRT and an adaptive residence time threshold are also considered to keep the probability of handover failures and unnecessary handovers within limits. Monte Carlo simulation results demonstrate that the proposed scheme outperforms existing approaches.
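A minimal sketch of a multiplicative weighted utility score over normalized network attributes; the candidate networks, attributes and weights are hypothetical, and the RRT estimation and handover logic of the proposed scheme are not shown.

```python
import numpy as np

# Hypothetical candidate networks with normalized attribute scores in [0, 1]
# (higher is better after normalization): bandwidth, delay, cost, signal.
networks = {
    "WLAN": np.array([0.90, 0.60, 0.95, 0.70]),
    "LTE":  np.array([0.75, 0.85, 0.55, 0.85]),
    "UMTS": np.array([0.50, 0.70, 0.70, 0.80]),
}

# Hypothetical weights reflecting application QoS needs and user preferences.
weights = np.array([0.35, 0.30, 0.20, 0.15])

def multiplicative_utility(scores, w):
    # Weighted product: a very low score on any single criterion drags the
    # utility down sharply, penalizing networks that fail one criterion badly.
    return float(np.prod(scores ** w))

best = max(networks, key=lambda name: multiplicative_utility(networks[name], weights))
for name, s in networks.items():
    print(f"{name:5s} utility = {multiplicative_utility(s, weights):.3f}")
print("selected network:", best)
```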

9.
This paper examines the feasibility and value of using nonparametric variance-based methods to supplement parametric regression methods for uncertainty analysis of computer models. It shows from theoretical considerations how the usual linear regression methods are a particular case within the general framework of variance-based methods. Strengths and weaknesses of the methods are demonstrated analytically and numerically in an example. The paper shows that relaxing the linearity assumptions in nonparametric variance-based methods comes at the cost of additional computer runs.

10.
The jar-test is a well-known tool for selecting chemicals for physical-chemical wastewater treatment. Jar-test results show the treatment efficiency in terms of suspended matter and organic matter removal. However, in spite of having all these results, coagulant selection is not an easy task, because one coagulant can efficiently remove the suspended solids but at the same time increase the conductivity or considerably increase the production of sludge containing chemicals and toxic dyes. This makes the final selection of coagulants very dependent on the relative importance assigned to each measured parameter. In this paper, the use of multicriteria decision analysis (MCDA) is proposed to help select the coagulant and its concentration in physical-chemical wastewater treatment, since textile wastewater contains hazardous substances. Starting from the parameters fixed by the jar-test results, these techniques allow the parameters to be weighted according to the judgements of wastewater experts and priorities to be established among coagulants. Two well-known MCDA techniques have been used, the analytic hierarchy process (AHP) and the preference ranking organization method for enrichment evaluations (PROMETHEE), and their results were compared. The proposed method has been applied to the particular case of textile wastewaters. The results obtained show that MCDA techniques are useful tools for selecting the chemicals for physical-chemical treatment.
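For the AHP side, here is a minimal sketch of deriving criterion weights from a pairwise comparison matrix via its principal eigenvector, with a hypothetical 3x3 comparison matrix and the standard consistency-ratio check; the actual jar-test criteria and expert judgements are not reproduced.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three jar-test criteria
# (e.g. turbidity removal, COD removal, sludge production) on Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# AHP priority weights: normalized principal eigenvector of the comparison matrix.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio check (random index RI = 0.58 for a 3x3 matrix).
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
print("weights:", np.round(w, 3), " CR:", round(ci / 0.58, 3))
```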

11.
《中国测试》2015,(Z1):138-142
To ensure the accuracy of value transfer for mass quantities and the traceability of the standard weight verification apparatus, this paper studies in depth the uncertainty evaluation method for such an apparatus. The evaluation takes into account the influence on the measurement results of factors such as measurement repeatability and instrument error, and an uncertainty evaluation result is finally obtained. The evaluated uncertainty is less than one third of the instrument error, which meets the accuracy requirement.
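A minimal GUM-style sketch, with invented readings and an assumed rectangular distribution for the instrument error, of combining a Type A repeatability component and a Type B instrument-error component into an expanded uncertainty; this is a generic illustration, not the apparatus-specific evaluation of the paper.

```python
import numpy as np

# Hypothetical repeated readings of a standard weight (grams).
readings = np.array([99.9998, 100.0001, 99.9999, 100.0002, 100.0000, 99.9999])

# Type A uncertainty: standard deviation of the mean from repeatability.
u_a = readings.std(ddof=1) / np.sqrt(readings.size)

# Type B uncertainty: instrument error limit assumed to follow a
# rectangular (uniform) distribution, so divide by sqrt(3).
instrument_error_limit = 0.0005  # grams, hypothetical
u_b = instrument_error_limit / np.sqrt(3)

# Combined standard uncertainty and expanded uncertainty (coverage factor k = 2).
u_c = np.sqrt(u_a**2 + u_b**2)
print("u_A =", u_a, " u_B =", u_b, " U (k=2) =", 2 * u_c)
```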

12.
General evaluation methods for measurement uncertainty in chemical analysis
This paper summarizes the common components of measurement uncertainty in chemical analysis and their interrelations, and discusses general methods for evaluating uncertainty in chemical analysis.

13.
Methods for determining the coverage factor in expanded uncertainty
By analysing the probability distribution of the possible values of the measurand, a method for determining the coverage factor of the expanded uncertainty is given, and the values of the coverage factor corresponding to non-integer effective degrees of freedom are studied in detail. Different interpolation functions are used for the case where the effective degrees of freedom are less than 5, and fairly accurate results are obtained.
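A minimal sketch of obtaining the coverage factor from the Student t distribution for non-integer effective degrees of freedom; scipy evaluates the t quantile for fractional degrees of freedom directly, whereas the paper studies interpolation between tabulated values, and the example degrees of freedom below are hypothetical.

```python
from scipy import stats

def coverage_factor(nu_eff, level=0.95):
    # Coverage factor k from the Student t distribution; scipy's t.ppf
    # accepts non-integer effective degrees of freedom, which avoids
    # interpolating between tabulated integer values.
    return stats.t.ppf(0.5 + level / 2, df=nu_eff)

# Hypothetical effective degrees of freedom, e.g. from a Welch-Satterthwaite calculation.
for nu in (3.4, 4.7, 12.8, 200):
    print(f"nu_eff = {nu:6.1f}  ->  k(95%) = {coverage_factor(nu):.3f}")
```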

14.
To satisfy the requirements of the manufacturing industries, material selection is considered an important task. A wrongly chosen material may unnecessarily increase the product cost and lead to early failure of the product. While developing a product, the most suitable material has to be selected based on the requirements of the product and the available material properties, and these material properties must match the product requirements. To aid this material selection decision-making process, several mathematical tools and techniques have already been developed by past researchers. Some of those techniques are computationally complex, and their solution accuracy is often affected by the introduction of additional mathematical parameters. In this paper, two conceptually simple but powerful mathematical techniques, the utility concept and the desirability function approach, are proposed to solve four material selection problems. These two methods use the quality characteristic values of the considered material alternatives to arrive at satisfactory results. A close match between the rankings of the alternatives obtained by these two methods and those derived by past researchers confirms the suitability of both approaches for solving material selection problems.
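A minimal sketch of a larger-is-better desirability calculation and a geometric-mean overall score for hypothetical material alternatives; the property scores and bounds are invented, and the utility-concept variant is not shown.

```python
import numpy as np

def desirability_larger_is_better(y, y_min, y_max, weight=1.0):
    # Derringer-Suich style one-sided desirability: 0 below y_min,
    # 1 above y_max, and a power-law ramp in between.
    d = np.clip((y - y_min) / (y_max - y_min), 0.0, 1.0)
    return d ** weight

# Hypothetical normalized property scores for three candidate materials
# (rows) on three larger-is-better criteria (columns).
scores = np.array([
    [0.80, 0.60, 0.90],
    [0.70, 0.85, 0.75],
    [0.95, 0.40, 0.65],
])

d = desirability_larger_is_better(scores, y_min=0.0, y_max=1.0)
overall = d.prod(axis=1) ** (1 / d.shape[1])   # geometric mean of desirabilities
ranking = np.argsort(-overall)
print("overall desirability:", np.round(overall, 3), " ranking (best first):", ranking)
```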

15.
The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework calls for a manufactured reality from which “experimental” data are created (possibly with experimental error), an imperfect model (with uncertain inputs) from which simulation results are created (possibly with numerical error), the application of a system for quantifying uncertainties in model predictions, and an assessment of how accurately those uncertainties are quantified. The application presented in this paper manufactures a particle-transport “universe”, models it using diffusion theory with uncertain material parameters, and applies both Gaussian process and Bayesian MARS algorithms to make quantitative predictions about new “experiments” within the manufactured reality. The results of this preliminary study indicate that, even in a simple problem, the improper application of a specific UQ method or unrealized effects of a modeling assumption may produce inaccurate predictions. We conclude that the validation framework presented in this paper is a powerful and flexible tool for the investigation and understanding of UQ methodologies.

16.
The decision sciences have emerged over the past 300 years with contributions from many highly recognized individuals. Yet, by and large, these results have not been incorporated into engineering design decision methods, nor have these methods been validated. The result is that design decision methods commonly exhibit undesirable behaviors that are clearly evident when one knows what to look for. Indications of bad behavior are presented, and a framework for validation of engineering design alternative selection methods is suggested.

17.
18.
The letter discusses certain aspects of introducing the Guide to the Expression of Uncertainty in Measurement, and the draft of Supplement 1 to this Guide for evaluating calibration results. Translated from Izmeritel’naya Tekhnika, No. 3, pp. 70–72, March, 2008.

19.
Expert systems are rapidly gaining importance in several areas of science, and in chemistry in particular. One of the most important factors determining the success of an expert system is the representation of the knowledge. To better understand the suitability of different knowledge representation techniques for the selection of methods in analytical chemistry, a test knowledge base was developed for the high-performance liquid chromatographic analysis of pharmaceutical compounds. This knowledge base has been used to study which features of knowledge representation are necessary to describe analytical chemical knowledge in a natural and efficient way. It is concluded that no single representation method is optimal; a combination of production rules and frame structures appears much more suitable. The possibility of consulting external databases and programs is also a very important aspect.

20.
In the optimal plastic design of mechanical structures one has to minimize a certain cost function subject to the equilibrium equation, the yield condition and some additional simple constraints, such as box constraints. A basic problem is that the model parameters and the external loads are random variables with a certain probability distribution. In order to obtain reliable and robust optimal designs with respect to random parameter variations using stochastic optimization methods, the original random structural optimization problem must be replaced by an appropriate deterministic substitute problem. Starting from the equilibrium equation and the yield condition, the problem can be described in the framework of stochastic (linear) programming problems with ‘complete fixed recourse’. The main properties of this class of substitute problems are discussed, especially the ‘dual decomposition’ data structure, which enables the use of very efficient special-purpose LP solvers.

