Similar Documents
Found 20 similar documents (search time: 31 ms)
1.
To reliably detect and identify weld defects, an automatic ultrasonic recognition method based on feature evaluation and a probabilistic neural network (PNN) is proposed. The method decomposes the defect signal with both wavelet packet decomposition and empirical mode decomposition, extracts dimensionless time-domain parameters from the original signal and each decomposed signal to form a joint feature set, and computes an evaluation factor for each feature. Features with large evaluation factors are selected as the sensitive inputs to the PNN, enabling automatic identification of different weld defect types. In-situ, on-aircraft inspection of landing gear welds shows that the method screens sensitive features out of a large candidate set, avoids the arbitrariness of selecting defect-sensitive features by hand, reduces the size of the PNN, and improves both classification accuracy and detection efficiency. The method holds good promise for in-situ field testing of aircraft.
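The PNN in entry 1 is, in essence, a Parzen-window classifier: each class's score is a sum of Gaussian kernels centered on that class's training patterns, and the highest score wins. A minimal sketch (not the paper's implementation; the feature vectors, labels, and sigma below are invented for illustration):

```python
import numpy as np

def pnn_classify(x, train_X, train_y, sigma=0.5):
    """Probabilistic neural network (Parzen-window) classifier: each class
    score is the sum of Gaussian kernels over that class's training
    patterns; the class with the highest score wins."""
    scores = {}
    for label in np.unique(train_y):
        patterns = train_X[train_y == label]
        d2 = np.sum((patterns - x) ** 2, axis=1)
        scores[label] = float(np.sum(np.exp(-d2 / (2.0 * sigma ** 2))))
    return max(scores, key=scores.get)

# Toy 2-D feature vectors standing in for weld-defect features (hypothetical):
train_X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
train_y = np.array([0, 0, 1, 1])   # 0 = porosity, 1 = crack (illustrative labels)
pred = pnn_classify(np.array([0.15, 0.15]), train_X, train_y)
```

In the paper the inputs would be the sensitive features chosen by the evaluation factors rather than these toy coordinates.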

2.
Unit load testing of boxes reported in the literature typically uses empty boxes to explore the impact and interactions among box configurations, pallet support and other environmental parameters. However, this approach leads to failure in the weakest box in the unitized structure, while in the field, failure almost always occurs in the bottom box, which may or may not be the weakest. We find in this paper that mathematically, numerically and experimentally, forcing box failure to the bottom results in higher test values. While this occurs naturally for boxes in use in the field, it is an interaction which to date has been overlooked by researchers examining box performance in the lab. The impact on box estimation can be on the order of 5% or more, which can be as significant as some of the environmental factors we are working to quantify, and which can have significant cost implications. To improve the assessment used in the industry to account for the impact of a box's ‘in use environment’ on its performance, we need further testing on configurations where the boxes are loaded. Copyright © 2010 John Wiley & Sons, Ltd.

3.
A new method is developed here for the real‐time integration of the equations of solid dynamics based on the use of proper orthogonal decomposition (POD)–proper generalized decomposition (PGD) approaches and direct time integration. The method is based upon the formulation of solid dynamics equations as a parametric problem, depending on their initial conditions. A sort of black‐box integrator that takes the resulting displacement field of the current time step as input and (via POD) provides the result for the subsequent time step at feedback rates on the order of 1 kHz is obtained. To avoid the so‐called curse of dimensionality produced by the large amount of parameters in the formulation (one per degree of freedom of the full model), a combined POD–PGD strategy is implemented. Examples that show the promising results of this technique are included. Copyright © 2014 John Wiley & Sons, Ltd.
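The POD half of the POD-PGD strategy in entry 3 is commonly computed from solution snapshots via the SVD. A generic snapshot-POD sketch (not the paper's solver; the synthetic snapshot matrix and the 0.999 energy threshold are invented for illustration):

```python
import numpy as np

def pod_basis(snapshots, energy=0.999):
    """Snapshot POD sketch: SVD of the snapshot matrix, keeping the leading
    modes that capture the requested fraction of the energy (measured by
    the squared singular values)."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cum = np.cumsum(s ** 2) / np.sum(s ** 2)
    r = int(np.searchsorted(cum, energy)) + 1
    return U[:, :r]

# Hypothetical displacement snapshots dominated by a single spatial mode:
t = np.linspace(0.0, 1.0, 50)
noise = 1e-3 * np.random.RandomState(0).randn(50, 20)
snapshots = np.outer(np.sin(np.pi * t), np.ones(20)) + noise
basis = pod_basis(snapshots)
```

Because the synthetic data has one dominant mode, the reduced basis collapses to a single orthonormal column.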

4.
Covering arrays (CAs) are combinatorial objects with interesting features that have practical applications such as experimental design and fault detection in hardware and software. We introduce a graph-based postoptimization (GBPO) approach that reduces the size of previously constructed CAs by exploiting their redundancy. To demonstrate the advantages of GBPO, we instantiated it with 2 sets of CAs: (1) 560 CAs of strength 2≤t≤6, alphabet 2≤v≤6, and 3≤k≤32 parameters, generated by an optimized version of In‐Parameter‐Order‐Generalized (IPOG‐F); GBPO improved all of these CAs, and 37 cases matched the best-known upper bounds; and (2) 32 CAs of strength t=2, alphabet 3≤v≤6, and 8≤k≤146 parameters; in this set, 16 cases were improved, and 16 cases were matched.
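The defining property of a strength-t covering array can be verified directly: every t-way choice of columns must cover all v^t symbol tuples. A small checker sketch (this is only the coverage property, not the GBPO algorithm; the 4-row binary array is a standard textbook example):

```python
from itertools import combinations

def is_covering_array(rows, t, v):
    """Check the defining property of a strength-t covering array over a
    v-ary alphabet: every t-way combination of columns must contain all
    v**t possible symbol tuples."""
    k = len(rows[0])
    for cols in combinations(range(k), t):
        seen = {tuple(row[c] for c in cols) for row in rows}
        if len(seen) < v ** t:
            return False
    return True

# A 4-row binary covering array of strength 2 on 3 parameters:
ca = [
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]
ok = is_covering_array(ca, t=2, v=2)
```

A postoptimizer such as GBPO would try to delete or merge rows while this predicate stays true.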

5.
An experimental test series was carried out to determine input parameters for a well-known continuum damage mechanics elementary ply plasticity model. A full suite of data was obtained for a carbon fibre and an S2-glass fibre-reinforced composite material, both currently used in the aerospace industry. Models implemented using the experimentally determined input parameters gave in-plane predictions in good agreement with experiments for both material systems. In addition, model predictions for cyclic loading accurately captured reload moduli and plastic strain magnitudes.

6.
This paper presents an automated POCOFAN-POFRAME algorithm that partitions large combinational digital VLSI circuits for pseudo-exhaustive testing. A simulation framework and partitioning technique are presented that allow VLSI circuits to be tested with fewer test vectors, reducing testing time and aiding VLSI circuit design. The framework combines two partitioning methods, Primary Output Cone Fanout Partitioning (POCOFAN) and POFRAME partitioning, to determine the number of test vectors in the circuit. The key role of partitioning is to identify reconvergent fanout branch pairs and the optimal values of the primary input node count N and fanout F using the I-PIFAN algorithm. The number and locations of reconvergent fanouts are critical for VLSI circuit testing and design for testability, so their selection is crucial for optimizing system performance and reliability. The design constraints of the partitioned circuit considered for optimization include critical path delay and test time. The POCOFAN-POFRAME algorithm uses the optimal values of the circuit's maximum primary input cone size (N) and minimum fan-out value (F) to determine the number of test vectors and the number and locations of partitions. The ISCAS'85 benchmark circuits have been successfully partitioned; test results for C499 show a 45% reduction in test vectors, and in comparisons with other partitioning methods our algorithm produces fewer test vectors.
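Entry 6 hinges on identifying reconvergent fanout branch pairs: a gate whose fanout branches meet again downstream. A generic graph sketch of that detection (not the I-PIFAN algorithm; the toy netlist and gate names are invented):

```python
from collections import defaultdict

def reconvergent_fanouts(edges):
    """Find fanout stems whose branches reconverge: a node with two or more
    successors such that paths from two distinct branches meet again at a
    common downstream node."""
    succ = defaultdict(list)
    for a, b in edges:
        succ[a].append(b)

    def reach(n, seen=None):
        # All nodes reachable from n (excluding n itself).
        seen = set() if seen is None else seen
        for m in succ.get(n, ()):
            if m not in seen:
                seen.add(m)
                reach(m, seen)
        return seen

    stems = []
    for node, outs in dict(succ).items():
        if len(outs) >= 2:
            branch_cones = [reach(o) | {o} for o in outs]
            if set.intersection(*branch_cones):
                stems.append(node)
    return stems

# Toy netlist: g1 fans out to g2 and g3, which reconverge at g4.
edges = [("g1", "g2"), ("g1", "g3"), ("g2", "g4"), ("g3", "g4"), ("g5", "g1")]
stems = reconvergent_fanouts(edges)
```

A partitioner would then try to keep each such stem and its reconvergence point inside one partition.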

7.
Regression testing (RT) is an essential but expensive activity in software development. RT confirms that modifications have not introduced new faults into the program. RT efficiency can be improved by selecting, within the given time frame, only the test cases relevant to the modifications. Several test case selection approaches have been introduced, but they either fall short of software testers' requirements or are ineffective and cannot be applied to the available test suite specifications and architecture. To address these limitations, we propose an improved and efficient test case selection (TCS) algorithm for RT. The proposed technique reduces execution time and the redundancy of duplicate test cases (TC), and detects only the changes relevant to the modifications in test cases. The evaluation of the proposed approach is based on fault detection, redundancy, and previously executed test cases. Results indicate that the proposed technique decreases, on average, the overall testing time needed to execute the modified test cases relative to the Hybrid Whale Optimization Algorithm (HWOA), a recent TCS approach in regression testing for a single product.
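The two ideas in entry 7, keeping only tests relevant to the modified code and dropping redundant duplicates, can be sketched generically (this is not the paper's algorithm; test names, coverage sets, and the modified-unit set are invented):

```python
def select_test_cases(test_suite, modified_units):
    """Generic regression test selection sketch: keep a test only if it
    touches at least one modified unit, and drop duplicate tests that
    exercise exactly the same set of units (redundancy reduction)."""
    selected, seen_footprints = [], set()
    for name, covered_units in test_suite:
        footprint = frozenset(covered_units)
        if not footprint & modified_units:
            continue                      # unrelated to the change set
        if footprint in seen_footprints:
            continue                      # redundant duplicate
        seen_footprints.add(footprint)
        selected.append(name)
    return selected

suite = [
    ("t1", {"login", "session"}),
    ("t2", {"report"}),
    ("t3", {"login", "session"}),   # duplicate footprint of t1
    ("t4", {"session", "cache"}),
]
picked = select_test_cases(suite, modified_units={"session"})
```

Real selection techniques replace the coverage sets with information mined from the test suite and the change history.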

8.
We consider the problem of interpolating and zero testing sparse multivariate polynomials over finite fields from their values given by a black box. We give an estimate of the size of a test set constructed by Clausen, Dress, Grabmeier, and Karpinski [2] and improve the previously known lower bounds on the size of a minimal test set. Further, we present for arbitrary finite fields a new interpolation algorithm that uses only evaluations over the ground field, thereby answering an open question of Dür and Grabmeier [3].

9.
The case study in this article is the modeling of temperature conditions between a temperature artifact and a black test corner measuring instrument. The black test corner is an instrument consisting of two wooden walls and a floor, with built-in thermocouples fixed on the back side of copper disks; the front of each disk is flush with the surface of the board. The black test corner is used to measure how the temperature of a household appliance influences its surroundings in a real environment, e.g., a kitchen or living room. The temperature artifact presented in this article is a specially developed heating plate that is very stable and can be set to different temperatures. Technical standards for conformity assessment usually describe only what should be measured and, in some cases, how accurate the measurement should be, but not what kind of measuring instrument should be used. As a result, measurements are sometimes performed with improper equipment or in an improper way. For the same level of appliance conformance testing, laboratories should use the same testing procedures and comparable measuring instruments. This article analyzes the influencing parameters when measuring temperature rise using the black test corner. The temperature conditions between a temperature artifact and a black test corner were modeled with commercial modeling software to determine whether such modeling can be used for detailed evaluation of all possible influencing parameters of the testing procedure. A scheme and a list of the influencing parameters to be modeled in subsequent research are prepared in order to arrange an optimal experiment.

10.
Reliability growth tests are often used for achieving a target reliability for complex systems via multiple test‐fix stages with limited testing resources. Such tests can be sped up via accelerated life testing (ALT) where test units are exposed to harsher‐than‐normal conditions. In this paper, a Bayesian framework is proposed to analyze ALT data in reliability growth. In particular, a complex system with components that have multiple competing failure modes is considered, and the time to failure of each failure mode is assumed to follow a Weibull distribution. We also assume that the accelerated condition has a fixed time scaling effect on each of the failure modes. In addition, a corrective action with fixed ineffectiveness can be performed at the end of each stage to reduce the occurrence of each failure mode. Under the Bayesian framework, a general model is developed to handle uncertainty on all model parameters, and several special cases with some parameters being known are also studied. A simulation study is conducted to assess the performance of the proposed models in estimating the final reliability of the system and to study the effects of unbiased and biased prior knowledge on the system‐level reliability estimates.
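The failure model in entry 10, competing Weibull failure modes under a fixed time-scaling acceleration factor, is easy to simulate: each mode draws a Weibull lifetime, the system fails at the minimum, and acceleration divides every mode's time scale. A simulation sketch (not the paper's Bayesian framework; the shape/scale values are illustrative):

```python
import random

def draw_failure_time(modes, accel, rng):
    """One competing-risks draw: each mode's lifetime is Weibull(shape, scale);
    the system fails at the minimum across modes.  A fixed time-scaling
    acceleration factor 'accel' divides every mode's scale parameter."""
    return min(rng.weibullvariate(scale / accel, shape) for shape, scale in modes)

rng = random.Random(42)
modes = [(1.5, 1000.0), (2.0, 1500.0)]   # (shape, characteristic life in hours)
normal = [draw_failure_time(modes, accel=1.0, rng=rng) for _ in range(20000)]
harsh = [draw_failure_time(modes, accel=3.0, rng=rng) for _ in range(20000)]
mean_normal = sum(normal) / len(normal)
mean_harsh = sum(harsh) / len(harsh)
```

Because the acceleration factor rescales time uniformly, the mean accelerated lifetime is shorter by roughly that same factor, which is the structure the Bayesian model exploits.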

11.
Much of the statistical literature on optimal test planning for accelerated life testing utilizes asymptotic methods to derive optimal test plans. While sufficient effort is made to assess the robustness of these test plans to the choice of design parameters and distribution assumptions, there is very little literature on the performance of asymptotic test plans with small samples (on the order of 10-15 units). A related concern is that an asymptotic test plan may not be the true "optimal" test plan for a given sample size. The purpose of this research is to present exact or "near-exact" methods for developing test plans and to compare their performance with corresponding asymptotic test plans in small-sample settings. The optimal location of design points and the sample allocation are determined using each method for lognormal and Weibull lifetime distributions with both complete and Type 1 right-censored data under two selected acceleration factor models. The investigations reveal that the asymptotic test plans corroborate quite well with the exact test plans and are thus suitably robust to small-sample settings in terms of optimal variance.

12.
For paper‐based dry pet food packaging, one of the main requirements is a high resistance against staining from the fat in the product. For both development and quality control, rapid and reliable standardized test procedures assessing this property are needed. Although a number of tests are available, they either apply only to certain types of packaging materials and show limited correlation with field behaviour, or employ non‐standard testing substances, long testing times and complicated equipment. In response to this situation, a new testing procedure that reflects field behaviour but without the drawbacks of the existing tests has been developed. The new test shows high reproducibility and good correlation with field performance for a wide range of multiwall bag and folding box materials with different types of grease resistance treatment. Copyright © 2002 John Wiley & Sons, Ltd.

13.
One of the ways to determine the inherent reliability of a design is to test it under controlled environments based on the product usage defined by the development requirements. This can be accomplished by performing a reliability growth test on the product. A testing approach can be developed that enhances product reliability and shortens the production testing cycle. Research performed to date indicates that such a methodology does not yet exist, and refining its development to fill this gap is the focus of continued research. Combining multiple testing approaches to ensure that the reliability requirement is met or exceeded, while retaining the ability to shorten the testing cycle when schedule and cost constraints require it, has not previously been addressed in the open literature. The methodology combines multiple testing approaches, drawing on complementary testing ideas from various technologies that have been used previously with documented success. This approach demonstrated that component-level testing reduced product-level failures by more than 80% while also shortening the schedule to complete all testing. Copyright © 2009 John Wiley & Sons, Ltd.

14.
Fans are widely used in HVAC systems and have a significant effect on air-handling performance. This paper introduces a multifunctional fan performance test system consisting of data acquisition hardware and efficient data acquisition and processing software developed in Visual Basic 6.0 for Windows XP. The system automatically acquires and processes the full set of performance parameters for several types of fans and plots their performance curves, so a single rig serves multiple purposes. Application examples show that the system is reliable, simple to operate, and highly accurate.

15.
Testing is an integral part of software development. Current fast-paced system development has rendered traditional testing techniques obsolete, so automated testing techniques are needed to keep pace with development speed. Model-based testing (MBT) is a technique that uses system models to generate and execute test cases automatically. Test data generation (TDG) in many existing model-based test case generation (MB-TCG) approaches is still manual; automatic and effective TDG can further reduce testing cost while detecting more faults. This study proposes an automated TDG approach for MB-TCG using the extended finite state machine (EFSM) model. The proposed approach integrates MBT with combinatorial testing, using the information available in an EFSM model together with the boundary value analysis strategy to automate the input domain classifications that the existing approach performed manually. The results showed that the proposed approach detected 6.62 percent more faults than conventional MB-TCG but generated 43 more tests. The proposed approach detects faults effectively, but further treatment of the generated tests, such as test case prioritization, is needed to increase the effectiveness and efficiency of testing.
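The boundary value analysis strategy mentioned in entry 15 picks test data just inside, on, and just outside each boundary of an input domain, plus a nominal interior value. A minimal sketch (the guard range below is a hypothetical EFSM transition condition, not one from the paper):

```python
def boundary_values(low, high):
    """Classic boundary value analysis for an integer domain [low, high]:
    values just around both boundaries plus a nominal interior point."""
    return [low - 1, low, low + 1, (low + high) // 2, high - 1, high, high + 1]

# Hypothetical EFSM guard: a transition enabled when 1 <= count <= 100.
test_data = boundary_values(1, 100)
```

The two out-of-range values exercise the guard's reject behavior, while the rest exercise the enabled transition.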

16.
Photoelectric inspection of shaft part dimensions using digital synchronization (cited 8 times: 0 self-citations, 8 by others)
Based on the characteristics of line-scan CCD inspection of two-dimensional images, a dimension inspection method for shaft parts based on digital synchronization is proposed. The method digitally keeps the scanning displacement (or displacement speed) in strict correspondence with the number of CCD line scans (or the line-scan rate): each CCD line scan is triggered by the scanning displacement, which effectively eliminates the influence of variations in the measured object's speed on inspection resolution and accuracy, improving precision. Because of the digital synchronization, inspection can proceed during the acceleration, constant-speed, and deceleration phases of the scanning motion, which improves inspection speed. An automatic image-edge tracking method acquires edge parameters automatically, enabling automatic two-dimensional, multi-dimension positioning and inspection of the measured object. Experiments show an inspection error of ≤ 0.02 mm; for an object with an axial dimension of 100 mm, the inspection time is < 5 s.

17.
Approaches to software failure probability estimation are mainly based on the results of testing. Test cases represent the inputs encountered in actual use. For a safety-critical application such as the reactor protection system (RPS) of a nuclear power plant, the test inputs are those that activate a protective action such as a reactor trip. A digital system treats inputs from instrumentation sensors as discrete digital values via an analog-to-digital converter, so the input profile must be determined with these characteristics in mind for effective software failure probability quantification. Another important characteristic of software testing is that a test need not be repeated for the same input value, since the software response is deterministic for each specific digital input. With these considerations, we propose an effective software testing method for quantifying the failure probability. As an example application, the input profile of the digital RPS is developed based on typical plant data. The proposed method is expected to provide a simple but realistic means to quantify the software failure probability based on the input profile and system dynamics.
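Entry 17's premise, that deterministic software need only be tested once per distinct digital input, connects directly to the standard zero-failure confidence bound: if n distinct inputs all pass, a (1 - alpha) upper bound on the per-demand failure probability follows from (1 - p)^n = alpha. A sketch of that textbook bound (not the paper's profile-based quantification; the test count is illustrative):

```python
def failure_prob_upper_bound(n_tests, confidence=0.95):
    """Zero-failure bound: if n distinct test inputs all pass, then with
    the stated confidence the per-demand failure probability satisfies
    p <= 1 - alpha**(1/n), where alpha = 1 - confidence."""
    alpha = 1.0 - confidence
    return 1.0 - alpha ** (1.0 / n_tests)

# Hypothetical campaign: 10,000 distinct digital inputs, all passing.
bound = failure_prob_upper_bound(10000)
```

The bound tightens as the number of distinct inputs grows, which is why the input profile determines how much testing is worthwhile.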

18.
Ranking a group of candidate sites and selecting from it the high-risk locations or hotspots for detailed engineering study and countermeasure evaluation is the first step in a transport safety improvement program. Past studies have however mainly focused on the task of applying appropriate methods for ranking locations, with few focusing on the issue of how to define selection methods or threshold rules for hotspot identification. The primary goal of this paper is to introduce a multiple testing-based approach to the problem of selecting hotspots. Following the recent developments in the literature, two testing procedures are studied under a Bayesian framework: Bayesian test with weights (BTW) and a Bayesian test controlling for the posterior false discovery rate (FDR) or false negative rate (FNR). The hypotheses tests are implemented on the basis of two random effect or Bayesian models, namely, the hierarchical Poisson/Gamma or Negative Binomial model and the hierarchical Poisson/Lognormal model. A dataset of highway–railway grade crossings is used as an application example to illustrate the proposed procedures incorporating both the posterior distribution of accident frequency and the posterior distribution of ranks. Results on the effects of various decision parameters used in hotspot identification procedures are discussed.
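The posterior-FDR idea in entry 18 can be sketched simply: rank sites by their posterior probability of being a hotspot and grow the flagged set while the expected proportion of false discoveries (the mean of 1 - p over flagged sites) stays within the target. This is a generic sketch, not the paper's BTW or hierarchical models; the posterior probabilities below are invented:

```python
def select_hotspots(posterior_probs, fdr_target=0.1):
    """Posterior-FDR selection sketch: sort sites by posterior probability
    of being a hotspot and add sites while the expected proportion of
    false discoveries (mean of 1 - p) stays within the target."""
    ranked = sorted(enumerate(posterior_probs), key=lambda kv: kv[1], reverse=True)
    flagged, expected_false = [], 0.0
    for site, p in ranked:
        if (expected_false + (1.0 - p)) / (len(flagged) + 1) > fdr_target:
            break
        flagged.append(site)
        expected_false += 1.0 - p
    return flagged

probs = [0.99, 0.40, 0.95, 0.97, 0.10, 0.90]   # hypothetical posteriors
hot = select_hotspots(probs, fdr_target=0.05)
```

In the paper these posteriors would come from the hierarchical Poisson/Gamma or Poisson/Lognormal models rather than being given directly.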

19.
20.
This work proposes a method for statistical effect screening to identify design parameters of a numerical simulation that are influential to performance while simultaneously being robust to epistemic uncertainty introduced by calibration variables. Design parameters are controlled by the analyst, but the optimal design is often uncertain, while calibration variables are introduced by modeling choices. We argue that uncertainty introduced by design parameters and calibration variables should be treated differently, despite potential interactions between the two sets. Herein, a robustness criterion is embedded in our effect screening to guarantee the influence of design parameters, irrespective of values used for calibration variables. The Morris screening method is utilized to explore the design space, while robustness to uncertainty is quantified in the context of info‐gap decision theory. The proposed method is applied to the National Aeronautics and Space Administration Multidisciplinary Uncertainty Quantification Challenge Problem, which is a black‐box code for aeronautic flight guidance that requires 35 input parameters. The application demonstrates that a large number of variables can be handled without formulating simplifying assumptions about the potential coupling between calibration variables and design parameters. Because of the computational efficiency of the Morris screening method, we conclude that the analysis can be applied to even larger‐dimensional problems. (Approved for unlimited, public release on October 9, 2013, LA‐UR‐13‐27839, Unclassified.) Copyright © 2015 John Wiley & Sons, Ltd.
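The Morris screening method used in entry 20 computes one-at-a-time "elementary effects": perturb each parameter from a random base point and record the resulting output change per unit step; the mean absolute effect ranks parameter influence. A minimal sketch (the toy black-box function is invented, not the NASA challenge code; this omits the trajectory-reuse economy of full Morris designs):

```python
import random

def morris_elementary_effects(f, n_params, delta=0.25, n_traj=40,
                              rng=random.Random(0)):
    """One-at-a-time Morris screening sketch: from random base points in
    [0, 1 - delta], perturb each parameter by 'delta' and record the
    elementary effect (output change / delta); return mean absolute
    effects, which rank parameter influence."""
    effects = [[] for _ in range(n_params)]
    for _ in range(n_traj):
        x = [rng.uniform(0.0, 1.0 - delta) for _ in range(n_params)]
        y0 = f(x)
        for i in range(n_params):
            xp = list(x)
            xp[i] += delta
            effects[i].append((f(xp) - y0) / delta)
    return [sum(map(abs, e)) / len(e) for e in effects]

# Toy black box: parameter 0 dominates, parameter 2 is inert (illustrative).
mu_star = morris_elementary_effects(lambda x: 10.0 * x[0] + 2.0 * x[1] + 0.0 * x[2], 3)
```

For the linear toy function the mean absolute effects recover the coefficients exactly, which is what makes it a convenient sanity check.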


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号