Similar Documents
20 similar documents retrieved (search time: 46 ms)
1.
The average number of inspections required in fault diagnosis to find the actual minimal cut set (MCS) causing a system failure depends on the inspection sequence adopted. However, this average is proved to be bounded below by the entropy of the cut-set importances, which may be used to estimate how difficult it is to find the actual MCS. Inspecting the component whose Fussell-Vesely importance is nearest to 0.5 leads to discovery of the actual MCS in a minimum number of inspections.
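As a minimal sketch of the ideas above (all probabilities hypothetical), the entropy of the cut-set importances can be computed directly, and the inspection heuristic simply picks the component whose Fussell-Vesely importance lies nearest to 0.5:

```python
import math

def cutset_importances(cutset_probs):
    # Normalize cut-set occurrence probabilities into importances.
    total = sum(cutset_probs)
    return [p / total for p in cutset_probs]

def entropy_bound(importances):
    # Shannon entropy (bits) of the cut-set importances: a lower bound
    # on the average number of inspections needed to find the actual MCS.
    return -sum(w * math.log2(w) for w in importances if w > 0)

def next_component(fv_importance):
    # Heuristic from the abstract: inspect the component whose
    # Fussell-Vesely importance is nearest to 0.5.
    return min(fv_importance, key=lambda c: abs(fv_importance[c] - 0.5))

# Hypothetical system with four minimal cut sets.
w = cutset_importances([0.4, 0.3, 0.2, 0.1])
print(round(entropy_bound(w), 3))                       # ~1.846 bits
print(next_component({"A": 0.9, "B": 0.55, "C": 0.2}))  # "B"
```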

2.
The best inspection strategy is generally obtained by minimizing the average number of inspections required to find the actual minimal cutset (MCS). This is an acceptable criterion if every inspection of a basic event costs the same. A criterion for the limiting case of equiprobable MCSs is presented that minimizes the cost rather than the average number of inspections.

3.
Endeavors to maximize the safe service lives of aeroengine components have led to a variety of life assessment methodologies. The following are reviewed in the paper: Life-to-first-crack, Databank Lifing, Damage Tolerance and Damage Mechanism based procedures. Their service implementation involves a variety of aspects, of which the following are briefly discussed: stress analysis, defects and component life extension methods. Building on several of the concepts discussed, a lifing methodology based on risk-regulated inspection is then introduced. By varying the inspection intervals over time, the number of inspections can be minimized while ensuring that the risk of failure does not exceed an acceptable level. Given inspection, it is shown that the actual safe service life of a component can be more strongly determined by the probability of crack detection than by the minimum detectable crack size.

4.
Certain regulated industries are monitored by inspections that ensure adherence (compliance) to regulations. These inspections often come at very short notice and can focus on particular aspects of the business. Failing such inspections can bring great losses to a company; thus, evaluating the risks of failure against various inspection strategies can help it ensure a robust operation. In this paper, we investigate a game-theoretic setup of a production planning problem under uncertainty in which a company is exposed to the risk of failing authoritative inspections due to non-compliance with enforced regulations. In the proposed decision model, the inspection agency is considered an adversary to the company whose production sites are subject to inspections. The outcome of an inspection is uncertain and is modeled as a Bernoulli-distributed random variable whose parameter is the mean of non-compliance probabilities of products produced at the inspected site and, therefore, is a function of production decisions. If a site fails an inspection, then all its products are deemed adulterated and cannot be used, jeopardizing the reliability of the company in satisfying customers' demand. In the proposed framework, we address two sources of uncertainty facing the company. First, through the adversarial setting, we address the uncertainty arising from the inspection process, as the company does not know a priori which sites the agency will choose to inspect. Second, we address data uncertainty via robust optimization. We model products' non-compliance probabilities as uncertain parameters belonging to polyhedral uncertainty sets and maximize the worst-case expected profit over these sets. We derive tractable and compact formulations in the form of a mixed integer program that can be solved efficiently via readily available standard software. Furthermore, we give theoretical insights into the structure of optimal solutions and worst-case uncertainties.
The proposed approach offers the flexibility of matching solutions to the level of conservatism of the decision maker via two intuitive parameters: the anticipated number of sites to be inspected, and the number of products at each site that are anticipated to be at their worst-case non-compliance level. Varying these parameters when solving for the optimal product allocation provides different risk-return tradeoffs, and selecting them is thus an essential part of the decision maker's strategy. We believe that the robust approach holds much potential for enhancing reliability in production planning and other similar frameworks in which the probability of random events depends on decision variables and in which the uncertainty of parameters is prevalent and difficult to handle.
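The inspection-outcome model described above can be sketched in a few lines; the product counts and non-compliance probabilities below are hypothetical:

```python
def site_failure_prob(noncompliance_probs):
    # Per the abstract, the inspection outcome is Bernoulli with parameter
    # equal to the mean non-compliance probability over products at the site.
    return sum(noncompliance_probs) / len(noncompliance_probs)

def expected_usable_units(units, noncompliance_probs, inspected):
    # If an inspected site fails, all of its units are deemed adulterated.
    p_fail = site_failure_prob(noncompliance_probs) if inspected else 0.0
    return units * (1.0 - p_fail)

# Hypothetical site producing three products.
probs = [0.1, 0.2, 0.3]
print(round(site_failure_prob(probs), 2))                 # 0.2
print(round(expected_usable_units(100, probs, True), 2))  # 80.0
```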

5.
An iterative method to treat the inverse problem of detecting cracks and voids in two‐dimensional piezoelectric structures is proposed. The method involves solving the forward problem for various flaw configurations, and at each iteration, the response of piezoelectric material is minimized at known specific points along the boundary to match measured data. Extended finite element method (XFEM) is employed for solving the forward problem as it allows the use of a single regular mesh for a large number of iterations with different flaw geometries. The minimization of cost function is performed by multilevel coordinate search (MCS) method. The algorithm is an intermediate between purely heuristic methods and methods that allow an assessment of the quality of the minimum obtained and is in spirit similar to the direct method for global optimization. In this paper, the XFEM‐MCS methodology is applied to two‐dimensional electromechanical problems where flaws considered are straight cracks and elliptical voids. The results show that this methodology can be effectively employed for damage detection in piezoelectric materials. Copyright © 2013 John Wiley & Sons, Ltd.

6.
This paper presents a study on the effect of blow-holes on the reliability of a cast component. The most probable point (MPP) based univariate response surface approximation is used for evaluating reliability. Crack geometry, blow-hole dimensions, external loads and material properties are treated as independent random variables. The methodology involves novel function decomposition at a most probable point that facilitates the MPP-based univariate response surface approximation of the original multivariate implicit limit state/performance function in the rotated Gaussian space. Once the approximate form of the original implicit limit state/performance function is defined, the failure probability can be obtained by Monte Carlo simulation (MCS), importance sampling technique, and first- and second-order reliability methods (FORM/SORM). FORTRAN code is developed to automate calls to ABAQUS for numerically simulating responses at sample points, to construct univariate response surface approximation, and to subsequently evaluate the failure probability by MCS, importance sampling technique, and FORM/SORM.
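The final MCS step described above can be illustrated with a minimal sketch. The explicit limit state below (resistance minus load, both normal) is a hypothetical stand-in for the response surface approximation in the paper:

```python
import random

def mcs_failure_probability(limit_state, sample, n=100_000, seed=1):
    # Crude Monte Carlo estimate of the failure probability P[g(X) <= 0].
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n) if limit_state(sample(rng)) <= 0.0)
    return fails / n

# Stand-in limit state g = R - S with resistance R ~ N(5,1) and load
# S ~ N(3,1); the analytic failure probability is Phi(-sqrt(2)) ~= 0.0786.
g = lambda x: x[0] - x[1]
draw = lambda rng: (rng.gauss(5, 1), rng.gauss(3, 1))
print(round(mcs_failure_probability(g, draw), 3))
```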

7.
With an increase in complex designs and tighter tolerances, the Coordinate Measuring Machine inspection process has become increasingly more advanced. By inspection planning, design data can be transferred to an inspection system and an entire inspection operation can be carried out with a minimum of time and with reduced uncertainty. The current need is to automate this process completely so that the inspection plan can be generated directly from the design information. Two modules of inspection planning, i.e. selection of part orientation and probe orientation sequencing, have not been dealt with properly. Also, some important factors for the selection of part orientation have been neglected and proper weights have not been given to the probe-orientation sequencing criteria. An attempt was made to overcome these limitations. Both problems have been approached as the ranking of a number of alternatives based on multiple criteria, where each criterion has unequal importance. To get the optimum probe-orientation sequence and stable part orientation, fuzzy logic was applied. Fuzzy sets were obtained and combined using a suitable methodology. To explain and validate the proposed methodology, an example part was taken. As a practical case, an engine block was considered and the results presented.

8.
Zhang Huaming (张华明). Industrial Engineering (《工业工程》), 2009, 12(5): 45-49
An enterprise's organization is composed of several functional departments. To reduce the organization's decision-making cost, a model for measuring decision cost is established, in which the cost is related to the oscillation of the information used in decision-making. The model gives methods for choosing the decision-making mode under different conditions. It shows that when the system-wide oscillation is much larger than the individual oscillations, an information-assimilation system is appropriate if the coordination requirements among departments are high, whereas an information-differentiation system should be adopted if competition among departments outweighs the need for coordination. Conversely, when the individual oscillations are much larger than the system-wide oscillation, a decentralized system has lower cost. When the system-wide and individual oscillations are comparable, a horizontal system is appropriate if the coordination requirements among departments are high; otherwise an information-dispersal system costs less.

9.
The paper presents a general method and procedure for fatigue reliability assessment integrating automated ultrasonic non-destructive inspections. The basic structure of an automated ultrasonic inspection system is presented. A fatigue reliability assessment methodology is developed using uncertainty quantification models for detection, sizing, and fatigue model parameters. The probability of detection model is based on a classical log-linear model coupling the actual flaw size with the ultrasonic inspection reported size. Using probabilistic modeling, the distribution of the actual flaw size is derived. A reliability assessment procedure using ultrasonic inspection data is suggested. A steam turbine rotor example with realistic ultrasonic inspection data is presented to demonstrate the overall method. Calculations and interpretations of assessment results based on risk recommendations for industrial applications are given.
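The classical log-linear probability-of-detection (POD) model mentioned above can be sketched as follows; the regression parameters and decision threshold are hypothetical, not taken from the paper:

```python
import math

def pod(a, beta0, beta1, sigma, a_threshold):
    # Classical log-linear POD model: the reported size satisfies
    # ln(a_hat) = beta0 + beta1*ln(a) + eps with eps ~ N(0, sigma^2),
    # and a flaw is "detected" when a_hat exceeds the decision threshold.
    mu = beta0 + beta1 * math.log(a)
    z = (mu - math.log(a_threshold)) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical parameters: POD rises with actual flaw size a (in mm).
for a in (0.5, 1.0, 2.0, 4.0):
    print(a, round(pod(a, beta0=0.1, beta1=1.0, sigma=0.4, a_threshold=1.0), 3))
```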

10.
Surface finish tends to be decisive in a large number of applications and, in general, it must be corrected by means of finishing operations. It is also highly important that the characteristics required of the products be determined beforehand, and on this basis the operating conditions that best suit the materials to be employed should be chosen. Factorial design is employed to study the effects of these operating conditions. Although factorial design is a well-known technique for analysing the effect of variables on an objective function, the uncertainty of these variables is often ignored, which means that the true behaviour of the objective function cannot be explained. Most previous investigators have studied the effect of cutting variables on surface roughness and many different models have been proposed, but they do not take the uncertainty in the measurements into account when performing the regression analysis. In this paper, models are developed to determine the surface quality of parts obtained by turning processes, using response surface methodology and considering the uncertainty due to the process and the measuring instruments. In addition, a methodology for determining the capability of manufacturing processes is presented.
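The factorial-design analysis mentioned above can be sketched with a two-level full factorial; the roughness model and its coefficients below are hypothetical stand-ins for a fitted turning model:

```python
from itertools import product

def main_effects(n_factors, response):
    # Main effect of each factor in a two-level full factorial design:
    # mean response at the high (+1) level minus mean at the low (-1) level.
    runs = list(product([-1, 1], repeat=n_factors))
    ys = [response(run) for run in runs]
    effects = []
    for k in range(n_factors):
        hi = [y for run, y in zip(runs, ys) if run[k] == 1]
        lo = [y for run, y in zip(runs, ys) if run[k] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Hypothetical roughness model: feed rate dominates cutting speed.
roughness = lambda x: 2.0 + 0.8 * x[0] + 0.1 * x[1]   # x = (feed, speed)
print(main_effects(2, roughness))   # ~[1.6, 0.2]
```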

11.
Structural failure investigations can be strongly influenced by high levels of uncertainty in modelling parameters, particularly in the case of historical constructions. This suggests that forensic analysts should perform probabilistic simulations, allowing a risk-informed diagnosis and prognosis of structural failures. In this study, a failure investigation methodology including uncertainty characterisation, modelling and propagation is presented and applied to a historic piperno stone balcony, the collapse of which caused four casualties. High uncertainty in the physical and mechanical properties of piperno stone, which has long been widely used in the architectural heritage of Naples and Southern Italy, motivated stochastic finite element (SFE) simulations to account for spatial variability of material properties throughout the balcony. Based on field inspections, laboratory surveys and experimental testing, a three-dimensional finite element (FE) model with four alternative restraint conditions was developed and material properties were statistically characterised. Experimental data were found to be in agreement with those available in the literature for similar piperno stones. Deterministic nonlinear FE simulations with mean material properties showed a major influence of restraint conditions, providing an initial identification of the most realistic model that was able to reproduce the observed damage. Then, SFE simulations were performed on structural models having random fields of material properties. It is shown that the selected SFE model of the balcony had a mean load capacity very close to the total load expected at the time of collapse, allowing the lowest uncertainty level in the output of the forensic analysis.

12.
An artificial lateral line (ALL) system consists of a set of flow sensors around a fish-like body. An ALL system aims to identify surrounding moving objects, a common example of which is a vibrating sphere, called a dipole. Accurate identification of a vibrating dipole is a challenging task because of the presence of different types of uncertainty in measurements or in the underlying flow model. Proper selection of the design parameters of the ALL system, including the shape, size, number and location of the sensors, can strongly influence the identification accuracy. This study aims to find such an optimum design by developing a specialized bi-level optimization methodology. It identifies and simulates different sources of uncertainty in the problem formulation. A parametric fitness function addresses computational and practical goals and encompasses the effect of different sources of uncertainty. It can also analyse the trade-off between localization accuracy and the number of sensors. Comparison of the results for different extents of uncertainty reveals that the optimized design strongly depends on the amount of uncertainty as well as the number of sensors. Consequently, these factors must be considered in the design of an ALL system. Another highlight of the proposed bi-level optimization methodology is that it is generic and can be readily extended to solve other noisy and nested optimization problems.

13.
14.
This paper presents a method for evaluating the expected damage associated with disintegrating complex networks with a given topology into isolated sub-networks (clusters) as a result of an intentional attack on randomly chosen network links. The method is based on a multi-dimensional spectra approach for evaluating the probability of network disintegration into a given number of sub-networks when a fixed number of randomly chosen links is eliminated. It also uses the contest success function, which evaluates the destruction probability of individual links as a function of per-link attack and defense efforts. It is assumed that the defender has no information about the attacker's actions and the attacker has no information about the network structure. The method allows analysts to compare different network topologies and to choose the one with the minimal expected damage under conditions of uncertainty. Illustrative examples are presented.
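A contest success function of the kind mentioned above can be sketched in ratio form; the intensity parameter m and the effort values are illustrative, not taken from the paper:

```python
def link_destruction_prob(attack, defense, m=1.0):
    # Ratio-form contest success function: probability that a link is
    # destroyed, given per-link attack and defense efforts. The contest
    # intensity m controls decisiveness; large m approaches winner-take-all.
    if attack == 0 and defense == 0:
        return 0.5
    return attack**m / (attack**m + defense**m)

print(link_destruction_prob(2.0, 1.0))        # 2/3
print(link_destruction_prob(2.0, 1.0, m=3))   # 8/9
```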

15.
Determining the importance of patients' requirements (PRs) is a critical issue in medical service design. Quality function deployment (QFD) is one of the most effective customer-driven quality system tools, typically applied to fulfil PRs. Often the patients cannot easily express their judgements on PRs' importance with exact numerical values, and they usually present their judgements on different scales. Therefore, this paper aims to provide a systematic method that simultaneously deals with PRs' fuzziness and multi-granularity in the QFD. Compared to previous research, its contribution is threefold. First, it proposes use of the 2-tuple linguistic model, which can effectively manage imprecise and vague evaluation information in QFD because of its accuracy and freedom from information loss. Second, it develops a new 2-tuple transformation function to solve the unification problem of multi-granular linguistic judgements in QFD. Third, it proposes a qualitative approach to deal with the uncertainty that exists extensively in establishing modifying factors. Finally, a practical case of hospital PRs is provided to illustrate the feasibility and effectiveness of the proposed methodology.
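The 2-tuple representation and scale unification mentioned above can be sketched as follows. The `unify` mapping between granularities is a simplified hypothetical transformation, not the specific function proposed in the paper:

```python
def to_two_tuple(beta):
    # 2-tuple linguistic representation: a numeric value beta in [0, g]
    # becomes (i, alpha) with i = round(beta) and alpha = beta - i, so
    # aggregation of linguistic terms loses no information.
    i = round(beta)
    return i, beta - i

def unify(index, source_granularity, target_granularity):
    # Hypothetical linear rescaling of a term index between linguistic
    # scales of different granularity (both scales indexed 0..g-1).
    return index * (target_granularity - 1) / (source_granularity - 1)

beta = unify(2, 5, 7)      # term s2 on a 5-term scale -> 7-term scale
print(beta)                # 3.0
print(to_two_tuple(beta))  # (3, 0.0)
```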

16.
The decision as to whether a contaminated site poses a threat to human health and should be cleaned up relies increasingly upon the use of risk assessment models. However, the more sophisticated risk assessment models become, the greater the concern with the uncertainty in, and thus the credibility of, risk assessment. In particular, when there are several equally plausible models, decision makers are confused by model uncertainty and perplexed as to which model should be chosen for making decisions objectively. When the correctness of the different models is not easily judged even after objective analysis, the cost incurred during the risk assessment process has to be considered in order to make an efficient decision. To support an efficient and objective remediation decision, this study develops a methodology to estimate the cost of the least required reduction of uncertainty and to use this cost measure in selecting among candidate models. The focus is on identifying the effort involved in reducing the input uncertainty to the point at which the uncertainty would not hinder the decision in each equally plausible model. First, this methodology combines a nested Monte Carlo simulation, rank correlation coefficients, and explicit decision criteria to identify the key uncertain inputs whose uncertainty should be reduced because they would influence the decision. It then calculates the cost of the required reduction of input uncertainty in each model using a convergence ratio, which measures the needed convergence level of each key input's spread. Finally, the most appropriate model can be selected based on the convergence ratio and cost. A case of a contaminated site is used to demonstrate the methodology.
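The rank-correlation screening step described above can be sketched with a pure-Python Spearman coefficient; the two-input risk model below is hypothetical and exists only to show one input dominating the output:

```python
import random

def ranks(xs):
    # Rank positions of each element (0 = smallest); ties ignored here.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = rank
    return r

def spearman(xs, ys):
    # Spearman rank correlation: Pearson correlation of the rank vectors.
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical risk model: output driven mainly by input x1, not x2,
# so x1 is the "key uncertain input" worth reducing.
rng = random.Random(0)
x1 = [rng.random() for _ in range(500)]
x2 = [rng.random() for _ in range(500)]
risk = [10 * a + 0.1 * b for a, b in zip(x1, x2)]
print(round(spearman(x1, risk), 2), round(spearman(x2, risk), 2))
```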

17.
Transportation professionals are sometimes required to make difficult transportation safety investment decisions in the face of uncertainty. In particular, an engineer may be expected to choose among an array of technologies and/or countermeasures to remediate perceived safety problems when: (1) little information is known about the countermeasure effects on safety; (2) information is known but comes from different regions, states, or countries where a direct generalization may not be appropriate; (3) the technologies and/or countermeasures are relatively untested; or (4) costs prohibit the full and careful testing of each of the candidate countermeasures via before-after studies. An informed and well-considered decision based on the best possible engineering knowledge and information is imperative given the potential impact on the numbers of human injuries and deaths that may result from these investments. This paper describes the formalization and application of a methodology to evaluate the safety benefit of countermeasures in the face of uncertainty. To illustrate the methodology, 18 countermeasures for improving the safety of at-grade railroad crossings (AGRXs) in the Republic of Korea are considered. Akin to "stated preference" methods in travel survey research, the methodology applies random selection and the laws of large numbers to derive accident modification factor (AMF) densities from expert opinions. In a full Bayesian analysis framework, the collective opinions in the form of AMF densities (data likelihood) are combined with prior knowledge (AMF density priors) for the 18 countermeasures to obtain 'best' estimates of AMFs (AMF posterior credible intervals). The countermeasures are then compared and recommended based on the largest safety returns with minimum risk (uncertainty). To the author's knowledge the complete methodology is new and has not previously been applied or reported in the literature.
The results demonstrate that the methodology is able to discern anticipated safety benefit differences across candidate countermeasures. For the 18 AGRX countermeasures considered in this analysis, it was found that the top three performers for reducing crashes are in-vehicle warning systems, obstacle detection systems, and constant warning time systems.
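The Bayesian combination of an AMF prior with a pooled expert-opinion likelihood can be sketched with a conjugate normal-normal update. This is a simplified stand-in for the full Bayesian analysis in the paper, and all densities below are hypothetical:

```python
def posterior_normal(prior_mean, prior_var, data_mean, data_var):
    # Conjugate normal-normal update: combine a prior AMF density with
    # the density derived from pooled expert opinions (the likelihood).
    precision = 1.0 / prior_var + 1.0 / data_var
    mean = (prior_mean / prior_var + data_mean / data_var) / precision
    return mean, 1.0 / precision

# Hypothetical AMF for one countermeasure: prior N(0.9, 0.05^2),
# expert opinions pooled to N(0.8, 0.03^2). An AMF below 1 means
# the countermeasure is expected to reduce accidents.
mean, var = posterior_normal(0.9, 0.05**2, 0.8, 0.03**2)
print(round(mean, 3), round(var**0.5, 3))
```

The posterior mean falls between the prior and the expert consensus, weighted toward the tighter (expert) density, and the posterior standard deviation is smaller than either input's.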

18.
Structural uncertainty quantification measures how parameter uncertainty propagates into uncertainty in the structural response. The traditional Monte Carlo method requires a large amount of numerical computation and is time-consuming, making it difficult to apply to the uncertainty quantification of large, complex structures. Surrogate-model methods build an approximate mathematical model from a small number of training samples and can replace the original physical model in uncertainty quantification to improve computational efficiency. To address the problem that high-fidelity samples are expensive to compute while low-fidelity samples are inaccurate, this paper proposes a generalized collaborative Gaussian process model that integrates high- and low-fidelity training samples. Within this framework, analytical expressions for the mean and variance of the structural response are derived, enabling analytical quantification of structural uncertainty. Three spatial-structure examples verify the accuracy of the analytical method, and comparisons with the traditional Monte Carlo method, the collaborative Gaussian process model and the Gaussian process model show that the proposed method is advantageous in both computational accuracy and efficiency.

19.
The conductance of an NPL orifice in the molecular regime is constant and can be exactly calculated from the geometric dimensions. For smaller Knudsen numbers the conductance increases, and correction functions are employed to reduce the uncertainty in this range of pressures. The conductance is also constant in the viscous regime during flow into a vacuum and can likewise be calculated. A suitable function with one free parameter has been chosen, which is constant for both very low and sufficiently high pressures, and the parameter was determined from the experimentally measured course of the conductance at the borderline between molecular and transitional flow. The function fits the experimental data very well and can be used to calculate the conductance of the orifice up to a Knudsen number of about 1.

20.
In this study, a Reliability-Based Optimization (RBO) methodology that uses Monte Carlo Simulation techniques is presented. Typically, the First Order Reliability Method (FORM) is used in RBO for failure probability calculation, and this is accurate enough for most practical cases. However, for highly nonlinear problems it can provide extremely inaccurate results and may lead to unreliable designs. Monte Carlo Simulation (MCS) is usually more accurate than FORM but very computationally intensive. In the RBO methodology presented in this paper, limit state approximations are used in conjunction with MCS techniques in an approximate MCS-based RBO that facilitates the efficient calculation of the probabilities of failure. A FORM-based RBO is first performed to obtain the initial limit state approximations. A Symmetric Rank-1 (SR1) variable metric algorithm is used to construct and update the quadratic limit state approximations. The approximate MCS-based RBO uses a conditional-expectation-based MCS, which was chosen over indicator-based MCS because of the smoothness of the probability of failure estimates and the availability of analytic sensitivities. The RBO methodology was implemented for an analytic test problem and a higher-dimensional, control-augmented-structure test problem. The results indicate that the SR1 algorithm provides accurate limit state approximations (and therefore accurate estimates of the probabilities of failure) for these test problems. It was also observed that the RBO methodology required two orders of magnitude fewer analysis calls than an approach that used exact limit state evaluations for both test problems.
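The contrast between indicator-based and conditional-expectation-based MCS can be sketched on a toy limit state g = R - S with both variables normal (a hypothetical example, not one of the paper's test problems). The conditional form averages an analytic conditional failure probability instead of 0/1 indicators, which is why its estimates are smooth:

```python
import math
import random

def norm_cdf(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def indicator_mcs(n, rng, mu_r, s_r, mu_s, s_s):
    # Indicator-based MCS: average of 0/1 failure indicators.
    hits = sum(1 for _ in range(n)
               if rng.gauss(mu_r, s_r) <= rng.gauss(mu_s, s_s))
    return hits / n

def conditional_mcs(n, rng, mu_r, s_r, mu_s, s_s):
    # Conditional-expectation MCS: sample only the load S and average
    # the analytic conditional probability P[R <= s], giving a smooth,
    # lower-variance estimate of the failure probability.
    return sum(norm_cdf((rng.gauss(mu_s, s_s) - mu_r) / s_r)
               for _ in range(n)) / n

rng = random.Random(42)
# g = R - S with R ~ N(5,1), S ~ N(3,1); exact P_f = Phi(-sqrt(2)) ~= 0.0786.
print(round(indicator_mcs(20_000, rng, 5, 1, 3, 1), 3))
print(round(conditional_mcs(20_000, rng, 5, 1, 3, 1), 3))
```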
