Similar Literature
20 similar documents found (search time: 36 ms)
1.
This paper presents a simple method for obtaining exact lower confidence bounds for reliabilities (tail probabilities) for items whose lifetimes follow a Weibull distribution where both the “shape” and “scale” parameters are unknown. These confidence bounds are obtained for both the censored and non-censored cases and are asymptotically efficient. They are exact even for small sample sizes in that they attain the desired confidence level precisely. The case of an additional unknown “location” or “shift” parameter is also discussed in the large-sample case. Tables are given of exact and asymptotic lower confidence bounds for the reliability for sample sizes of 10, 15, 20, 30, 50 and 100 for various censoring fractions.
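As a rough, simulation-based companion to the exact bounds described above (a parametric-bootstrap sketch in Python, not the authors' pivotal method; the sample values, evaluation time t, and bootstrap settings are hypothetical):

```python
import numpy as np
from scipy.optimize import brentq

def weibull_mle(x):
    """Maximum likelihood estimates (shape k, scale lam) for a complete Weibull sample."""
    x = np.asarray(x, dtype=float)
    logx = np.log(x)
    y = x / x.max()                      # scale-invariant weights avoid overflow
    def score(k):
        w = y ** k
        return np.sum(w * logx) / np.sum(w) - 1.0 / k - logx.mean()
    k = brentq(score, 1e-3, 1e3)
    lam = x.max() * np.mean(y ** k) ** (1.0 / k)
    return k, lam

def boot_lower_bound(x, t, alpha=0.05, B=2000, seed=None):
    """Basic-bootstrap lower (1 - alpha) confidence bound on R(t) = P(T > t)."""
    rng = np.random.default_rng(seed)
    k, lam = weibull_mle(x)
    r_hat = np.exp(-(t / lam) ** k)
    rhats = np.empty(B)
    for b in range(B):
        xb = lam * rng.weibull(k, size=len(x))   # resample from the fitted model
        kb, lb_ = weibull_mle(xb)
        rhats[b] = np.exp(-(t / lb_) ** kb)
    return float(min(1.0, max(0.0, 2.0 * r_hat - np.quantile(rhats, 1.0 - alpha))))

rng = np.random.default_rng(42)
sample = 100.0 * rng.weibull(2.0, size=30)       # hypothetical data: shape 2, scale 100
k_est, lam_est = weibull_mle(sample)
lb = boot_lower_bound(sample, t=50.0, seed=1)
true_R = np.exp(-(50.0 / 100.0) ** 2)            # true reliability at t = 50
```

Unlike the paper's bounds, a bootstrap bound is only approximately calibrated in small samples, which is precisely the gap the exact method closes.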

2.
In this paper, we propose degradation test sampling plans (DTSPs) used to determine the acceptability of a product under a Wiener process model. A test statistic and an acceptance criterion based on Wiener process parameter estimates are proposed. The design of a degradation test is investigated using a model incorporating a test cost constraint to minimize the asymptotic variance of the proposed test statistic. Some important variables, including the sample size, measurement frequency, and total test time, are chosen as decision variables in a degradation test plan. The asymptotic variance of the test statistic and the approximate functional forms of the optimal solutions are derived. A search algorithm for finding the optimal DTSPs is also presented as a flow chart. In addition, we assess the minimum cost required for the test procedure to satisfy the minimum requirements on the producer's risk and the consumer's risk. When the given test budget is not large enough, we suggest some methods to find appropriate solutions. Finally, a numerical example is used to illustrate the proposed methodology. Optimum DTSPs are obtained and tabulated for some combinations of commonly used producer and consumer risk requirements. A sensitivity analysis is also conducted to investigate the sensitivity of the obtained DTSPs to the cost parameters used.
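A minimal sketch of the Wiener-process estimation that underlies such plans, assuming equally spaced measurements and hypothetical values for the drift, diffusion, sample size, and measurement frequency (the closed-form MLEs below follow from the independent Gaussian increments of the process, not from the paper's DTSP procedure):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical degradation test: n units, m measurements each, at interval dt.
mu_true, sigma_true = 2.0, 0.5
n_units, m_meas, dt = 10, 20, 1.0

# Wiener-process paths: increments are independent N(mu*dt, sigma^2*dt).
inc = rng.normal(mu_true * dt, sigma_true * np.sqrt(dt), size=(n_units, m_meas))
paths = np.cumsum(inc, axis=1)

# Closed-form MLEs from the increments:
#   mu_hat     = total observed degradation / total observation time
#   sigma2_hat = mean squared deviation of increments about mu_hat*dt, scaled by 1/dt
total_time = n_units * m_meas * dt
mu_hat = inc.sum() / total_time
sigma2_hat = np.mean((inc - mu_hat * dt) ** 2) / dt

# Asymptotic variance of mu_hat from the Fisher information of N(mu*dt, sigma^2*dt),
# with N = n_units * m_meas increments -- the kind of quantity a DTSP minimizes.
N = n_units * m_meas
avar_mu = sigma2_hat / (N * dt)
```

Note how `avar_mu` depends on the product of sample size, measurement count, and interval, which is why these enter the plan as decision variables under a cost constraint.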

3.
Accelerated testing has been widely used for several decades. Beginning with accelerated life tests under constant-stress loadings, interest has increasingly shifted toward accelerated degradation tests and time-varying stress loadings. Because accelerated testing is crucial to the assessment of product reliability and the design of warranty policies, it is important to develop an efficacious test plan that addresses important issues such as the design of stress profiles, sample allocation, test duration, measurement frequency, and budget constraints. In recent years, extensive research has been conducted on the optimal design of accelerated testing plans, and the consideration of multiple stresses with interactions has become a major challenge in such experimental designs. The purpose of this study is to provide a comprehensive review of important methods for statistical inference and optimal design of accelerated testing plans by compiling the existing body of knowledge in the area. Different types of test planning strategies are categorized, and their drawbacks and the research trends are discussed to assist researchers and practitioners in conducting new research in this area.

4.
Technometrics, 2013, 55(3): 242–249
In industry, one sometimes compares a sample mean and minimum, or a mean and maximum, to reference values to determine whether a lot should be accepted. Particularly prominent examples of such procedures are “Category B” sampling plans for checking the net contents of packaged goods. Because the exact joint distribution of an extremum and the mean of a sample is usually complicated, establishing these reference values using statistical considerations typically involves crude approximations or simulation, even under the assumption of normality. The purpose of this article is to use the saddlepoint method to develop a fairly simple and very accurate approximation to the joint cumulative distribution function (cdf) of the mean and an extremum of a normal sample. This approximation can be used to establish statistically based acceptance criteria or to evaluate the performance of sampling plans based on criteria derived in other ways. These uses are illustrated with examples.
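Before reaching for the saddlepoint approximation, the joint cdf can always be estimated by brute-force Monte Carlo, which is useful as a reference point for checking any approximation; a small Python sketch for a standard normal sample (the sample size and evaluation points are arbitrary):

```python
import numpy as np

def joint_cdf_mc(a, b, n, reps=200_000, seed=None):
    """Monte Carlo estimate of F(a, b) = P(mean <= a, max <= b) for an
    i.i.d. N(0, 1) sample of size n."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((reps, n))
    return float(np.mean((x.mean(axis=1) <= a) & (x.max(axis=1) <= b)))

n = 5
p_joint = joint_cdf_mc(0.0, 0.0, n, seed=1)      # max <= 0 already forces mean <= 0
p_max   = joint_cdf_mc(np.inf, 0.0, n, seed=1)   # marginal: P(max <= 0) = 0.5**n
p_mean  = joint_cdf_mc(0.0, np.inf, n, seed=1)   # marginal: P(mean <= 0) = 0.5
```

The marginals have known values here, which gives a quick correctness check; the saddlepoint approach replaces this slow, noisy estimate with a fast analytic approximation.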

5.
Product tests are generally intended to serve two purposes: first, to expose those mechanisms which are likely to prove troublesome so that they may be eliminated, and second, to provide estimates of the performance parameters of the product. In this paper, systems in which the failure rate is the significant performance parameter and in which failures constitute a Poisson process are considered. It is assumed that every system failure may be classified as “correctable” or “intrinsic” and that every correctable failure which occurs during test leads to elimination of the mechanism which caused it. Optimal test plans are developed based on the purposes of the test and the economic factors of importance to the manufacturer.

6.
Neutrophil dysfunction is strongly linked to type 2 diabetes mellitus (T2DM) pathophysiology, but the prognostic potential of neutrophil biomarkers remains largely unexplored due to arduous leukocyte isolation methods. Herein, a novel integrated microdevice is reported for single‐step neutrophil sorting and phenotyping (chemotaxis and formation of neutrophil extracellular traps (NETosis)) using small blood volumes (fingerprick). Untouched neutrophils are purified on‐chip from whole blood directly using biomimetic cell margination and affinity‐based capture, and are exposed to preloaded chemoattractant or NETosis stimulant to initiate chemotaxis or NETosis, respectively. Device performance is first characterized using healthy and in vitro inflamed blood samples (tumor necrosis factor alpha, high glucose), followed by clinical risk stratification in a cohort of subjects with T2DM. Interestingly, “high‐risk” T2DM patients characterized by severe chemotaxis impairment reveal significantly higher C‐reactive protein levels and poor lipid metabolism characteristics as compared to “low‐risk” subjects, and their neutrophil chemotaxis responses can be mitigated after in vitro metformin treatment. Overall, this unique and user‐friendly microfluidics immune health profiling strategy can significantly aid the quantification of chemotaxis and NETosis in clinical settings, and be further translated into a tool for risk stratification and precision medicine methods in subjects with metabolic diseases such as T2DM.

7.
The D‐optimality criterion is often used in computer‐generated experimental designs when the response of interest is binary, such as when the attribute of interest can be categorized as pass or fail. The majority of methods in the generation of D‐optimal designs focus on logistic regression as the base model for relating a set of experimental factors with the binary response. Despite the advances in computational algorithms for calculating D‐optimal designs for the logistic regression model, very few have acknowledged the problem of separation, a phenomenon where the responses are perfectly separable by a hyperplane in the design space. Separation causes one or more parameters of the logistic regression model to be inestimable via maximum likelihood estimation. The objective of this paper is to investigate the tendency of computer‐generated, nonsequential D‐optimal designs to yield separation in small‐sample experimental data. Sets of local D‐optimal and Bayesian D‐optimal designs with different run (sample) sizes are generated for several “ground truth” logistic regression models. A Monte Carlo simulation methodology is then used to estimate the probability of separation for each design. Results of the simulation study confirm that separation occurs frequently in small‐sample data and that separation is more likely to occur when the ground truth model has interaction and quadratic terms. Finally, the paper illustrates that different designs with identical run sizes created from the same model can have significantly different chances of encountering separation.
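Complete separation can be detected as a linear-programming feasibility problem, which makes a Monte Carlo estimate of the separation probability straightforward to sketch. The design, ground-truth coefficients, and replication count below are hypothetical, and only complete (not quasi-complete) separation is checked:

```python
import numpy as np
from scipy.optimize import linprog

def is_separated(X, y):
    """True if some hyperplane (with intercept) completely separates the two
    response classes -- the condition under which logistic-regression MLEs diverge."""
    Xa = np.column_stack([np.ones(len(y)), np.asarray(X, dtype=float)])
    z = 2.0 * np.asarray(y, dtype=float) - 1.0       # +/-1 class labels
    # Feasibility LP: find w with z_i * (x_i . w) >= 1 for every run i.
    res = linprog(np.zeros(Xa.shape[1]),
                  A_ub=-(z[:, None] * Xa), b_ub=-np.ones(len(y)),
                  bounds=[(None, None)] * Xa.shape[1], method="highs")
    return res.status == 0                            # 0 = feasible, 2 = infeasible

def prob_separation(design, beta, reps=300, seed=None):
    """Monte Carlo estimate of P(separation) for a fixed design matrix and a
    'ground truth' logistic model with coefficient vector beta (incl. intercept)."""
    rng = np.random.default_rng(seed)
    Xa = np.column_stack([np.ones(len(design)), design])
    p = 1.0 / (1.0 + np.exp(-Xa @ beta))
    return float(np.mean([is_separated(design, rng.random(len(p)) < p)
                          for _ in range(reps)]))

# Sanity checks: one point per class is separable; overlapping classes are not.
sep_yes = is_separated(np.array([[0.0], [1.0]]), np.array([0, 1]))
sep_no = is_separated(np.array([[0.0], [1.0], [0.0], [1.0]]), np.array([0, 1, 1, 0]))

# Hypothetical 2^2 factorial, duplicated runs (n = 8), strong main effects:
design = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]] * 2, dtype=float)
p_sep = prob_separation(design, beta=np.array([0.0, 2.0, 2.0]), seed=7)
```

The LP formulation is scale-invariant, so requiring a margin of 1 loses no generality; a quasi-complete-separation check would relax the inequalities to allow points on the hyperplane.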

8.
The performance of a certain type of radar board-level product is often affected by multiple stresses, and within the limited time of an accelerated degradation test it is difficult to obtain a large amount of performance degradation information for such a product. To address these problems, an optimal design method for a dual-stress crossed step-down accelerated degradation test of radar board-level products is proposed. Monte-Carlo simulation is used to simulate the accelerated test. With the sample size fixed, the monitoring frequency, the number of stress levels, and the number of monitoring occasions are taken as design variables, the total test cost is taken as the constraint, and the asymptotic variance of the estimated p-th quantile lifetime under normal operating stress is taken as the objective function, yielding an optimal design model for the dual-stress crossed step-down accelerated degradation test. A simulation example verifies the effectiveness and feasibility of the method.

9.
This paper presents the best compromise three-level constant-stress accelerated life test plans for Weibull distributions with different censoring times. The best compromise test plans choose the stress levels, the test units allocated to each stress, and the censoring times to simultaneously minimize the asymptotic variance of the MLE of the mean log life at the design stress and the total running time. This paper also compares these plans with existing three-level test plans and concludes that the best compromise test plans (1) have smaller asymptotic variances, (2) are more robust, and (3) are more cost-effective than the existing three-level test plans.

10.
In this paper, we propose three new sampling plans, namely the resubmitted single sampling plan (RSSP), the repetitive group sampling (RGS) plan, and the multiple dependent state (MDS) sampling plan, for the zero‐inflated negative binomial distribution in microbiological food safety and quality assurance practices. The unity value approach is used to find optimal plan parameters. The proposed plans are compared with the single sampling plan (SSP). We found that the degree of clustering and the excess of zero counts affect the performance of all sampling plans. The MDS plan outperforms the SSP, RSSP, and RGS plans with respect to minimum average sample number in most cases. The RGS and MDS plans show comparable performance. The average run length is calculated to evaluate the rejection capability of the plans and to signal deterioration in lot quality. An example from nine Irish abattoirs is used to illustrate the application of the proposed methods.
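A sketch of how an acceptance probability might be computed under a zero-inflated negative binomial model, assuming a simple single-sampling rule (accept if at most c of n units exceed a count limit m). The plan parameters and ZINB values are hypothetical, and this is not the paper's unity-value design procedure:

```python
import numpy as np
from scipy.stats import nbinom, binom

def zinb_sf(m, pi0, r, p):
    """P(count > m) under a zero-inflated negative binomial: with probability
    pi0 the count is a structural zero, otherwise it is NB(r, p)."""
    return (1.0 - pi0) * nbinom.sf(m, r, p)

def accept_prob(n, c, m, pi0, r, p):
    """Acceptance probability of a single sampling plan: test n units and
    accept the lot if at most c units have a count exceeding the limit m."""
    p_nc = zinb_sf(m, pi0, r, p)        # per-unit nonconformance probability
    return float(binom.cdf(c, n, p_nc))

# Hypothetical plan (n = 10 units, accept if at most c = 1 exceeds m = 100)
# evaluated at a "good" and a "bad" hypothetical quality level:
pa_good = accept_prob(n=10, c=1, m=100, pi0=0.6, r=0.5, p=0.05)
pa_bad  = accept_prob(n=10, c=1, m=100, pi0=0.1, r=0.5, p=0.01)
```

Sweeping the quality parameters traces out the plan's operating characteristic curve, which is what the producer's and consumer's risk requirements constrain.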

11.
In many fields, there is the need to monitor quality characteristics defined as the ratio of two random variables. The design and implementation of control charts directly monitoring the ratio stability is required for the continuous surveillance of these quality characteristics. In this paper, we propose two one‐sided exponentially weighted moving average (EWMA) charts with subgroups having sample size n > 1 to monitor the ratio of two normal random variables. The optimal EWMA smoothing constants, control limits, and ARLs have been computed for different values of the in‐control ratio and correlation between the variables and are shown in several figures and tables to discuss the statistical performance of the proposed one‐sided EWMA charts. Both deterministic and random shift sizes have been considered to test the two one‐sided EWMA charts' sensitivity. The obtained results show that the proposed one‐sided EWMA control charts are more sensitive to process shifts than other charts already proposed in the literature. The practical application of the proposed control schemes is discussed with an illustrative example. Copyright © 2015 John Wiley & Sons, Ltd.
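The in-control ARL of such a chart can be approximated by simulating run lengths. The sketch below assumes an upper one-sided EWMA on the ratio of subgroup means, reflected at the in-control ratio; all parameters and the control limit are hypothetical illustrations, not the optimized designs tabulated in the paper:

```python
import numpy as np

def ewma_run_length(lam, ucl, mu_x, mu_y, sd, n, rho=0.0, seed=None, max_len=2000):
    """Run length of an upper one-sided EWMA chart on the ratio of two normal
    subgroup means; the statistic is reflected at the in-control ratio z0."""
    rng = np.random.default_rng(seed)
    z0 = mu_x / mu_y
    cov = sd ** 2 * np.array([[1.0, rho], [rho, 1.0]])
    e = z0
    for t in range(1, max_len + 1):
        xbar, ybar = rng.multivariate_normal([mu_x, mu_y], cov, size=n).mean(axis=0)
        e = max(z0, (1.0 - lam) * e + lam * xbar / ybar)   # one-sided: reflect at z0
        if e > ucl:
            return t
    return max_len

# In-control ARL estimated over 200 simulated run lengths (hypothetical design):
rl = [ewma_run_length(0.2, 1.015, 10.0, 10.0, 0.5, n=5, rho=0.4, seed=s)
      for s in range(200)]
arl_ic = float(np.mean(rl))
```

In a real design exercise the control limit would be searched so that `arl_ic` hits a target value, and the out-of-control ARL would then be minimized over the smoothing constant.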

12.
Over the past three decades, significant research efforts have focused on improving the charge carrier mobility of organic thin‐film transistors (OTFTs). In recent years, a commonly observed nonlinearity in OTFT current–voltage characteristics, known as the “kink” or “double slope,” has led to widespread mobility overestimations, contaminating the relevant literature. Here, published data from the past 30 years is reviewed to uncover the extent of the field‐effect mobility hype and identify the progress that has actually been achieved in the field of OTFTs. Present carrier‐mobility‐related challenges are identified, finding that reliable hole and electron mobility values of 20 and 10 cm² V⁻¹ s⁻¹, respectively, have yet to be achieved. Based on the analysis, the literature is then reviewed to summarize the concepts behind the success of high‐performance p‐type polymers, along with the latest understanding of the design criteria that will enable further mobility enhancement in n‐type polymers and small molecules, and the reasons why high carrier mobility values have been consistently produced from small molecule/polymer blend semiconductors. Overall, this review brings together important information that aids reliable OTFT data analysis, while providing guidelines for the development of next‐generation organic semiconductors.

13.
In this article we compute the expected Fisher information and the asymptotic variance–covariance matrix of the maximum likelihood estimates based on a progressively type II censored sample from a Weibull distribution by direct calculation as well as the missing-information principle. We then use these values to determine the optimal progressive censoring plans. Three optimality criteria are considered, and some selected optimal progressive censoring plans are presented according to these optimality criteria. We also discuss the construction of progressively censored reliability sampling plans for the Weibull distribution. Three illustrative examples are provided with discussion.
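Progressively Type-II censored samples can be generated directly with the Balakrishnan–Sandhu algorithm, which is convenient for checking such plans by simulation; the censoring scheme and Weibull parameters below are hypothetical:

```python
import numpy as np

def progressive_typeII_uniform(n, m, R, seed=None):
    """Progressively Type-II censored uniform sample (Balakrishnan-Sandhu
    algorithm): m ordered failures out of n units, with R[i-1] surviving
    units withdrawn at the i-th observed failure."""
    assert len(R) == m and sum(R) == n - m
    rng = np.random.default_rng(seed)
    W = rng.random(m)
    V = np.empty(m)
    for i in range(1, m + 1):
        # V_i = W_i ** (1 / (i + R_m + R_{m-1} + ... + R_{m-i+1}))
        V[i - 1] = W[i - 1] ** (1.0 / (i + sum(R[m - i:])))
    # U_i = 1 - V_m * V_{m-1} * ... * V_{m-i+1}, giving U_1 < U_2 < ... < U_m
    return 1.0 - np.cumprod(V[::-1])

def progressive_weibull(n, m, R, shape, scale, seed=None):
    """Map the uniform censored sample through the Weibull quantile function."""
    U = progressive_typeII_uniform(n, m, R, seed)
    return scale * (-np.log(1.0 - U)) ** (1.0 / shape)

# Hypothetical scheme: n = 20 units, m = 10 observed failures, one unit
# withdrawn at each failure.
u = progressive_typeII_uniform(20, 10, [1] * 10, seed=3)
x = progressive_weibull(20, 10, [1] * 10, shape=2.0, scale=100.0, seed=3)
```

Repeating this generation many times and computing the MLEs on each replicate gives a Monte Carlo check on the expected Fisher information and the asymptotic variances for any candidate censoring plan.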

14.
With rapid technological advances, products are becoming more reliable. Consequently, multistress accelerated life testing (MALT) has been adopted in engineering to obtain failure information in a limited time. To make the testing procedure more efficient, the test plan must be well designed; to date, however, research on the planning of MALT is limited. Multiple stresses lead to many stress-level combinations that require too much cost and time to implement. Besides, there may be interactions among the stresses, which require more experiments for parameter estimation. To solve these problems, we propose a novel planning method for constant-stress MALT under the lognormal distribution using D-optimal design, which can efficiently reduce the number of required test points and handle second-order effects in the models. In ALT, the log-linear model is often used to describe the life–stress relationship; hence, D-optimal design is adopted in this paper to select test points from the whole test region. Optimal unit allocation plans are then formulated under the V- and D-optimality criteria, respectively, where both Type I and Type II censoring are discussed. A real case of a light-emitting device (LED) is presented to compare the proposed approach with two other existing methods. The results show that the proposed method outperforms both existing methods in prediction accuracy and estimation precision. Moreover, a sensitivity analysis reveals the robustness of the optimal plans determined by the proposed method.

15.
Sequential tolerance control (STC) is a tolerance control methodology used in discrete parts manufacturing. Recently, an adaptive sphere‐fitting method for STC (ASF–STC) was developed to account for potential skewness in manufacturing operations' distributions, a factor not considered in conventional STC. ASF–STC offers significant improvements over conventional STC when such skewness exists. The direction of skewness of an operation's distribution is a necessary input to ASF–STC. Thus, a novel approach to determining the skewness of a distribution for small sample sizes is presented here. ASF–STC additionally requires distribution information for each operation. The beta distribution is an ideal candidate here, as it is very flexible in shape. The literature on four‐parameter beta estimation is very limited, and the performance of existing methods for small sample sizes is poor. STC was designed for low‐volume production; thus, estimation from small samples is necessary. This study presents a heuristic, based on the method‐of‐moments estimates for a beta distribution, that estimates the four parameters of a beta distribution from a small sample. Several computational results are provided to compare this heuristic to the best‐known procedure, with the heuristic found to perform better for the test problems considered. Copyright © 2002 John Wiley & Sons, Ltd.
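As a simplified illustration of moment-based beta fitting (two parameters on a known support, not the paper's four-parameter heuristic, which must also estimate the support endpoints), with hypothetical sample data:

```python
import numpy as np

def beta_mom(x, lower, upper):
    """Method-of-moments estimates (alpha, beta) for a beta distribution on a
    KNOWN support [lower, upper]: rescale to (0, 1), then match the sample
    mean and variance.  Requires var < mean * (1 - mean) on the unit scale."""
    y = (np.asarray(x, dtype=float) - lower) / (upper - lower)
    m, v = y.mean(), y.var()
    common = m * (1.0 - m) / v - 1.0
    return m * common, (1.0 - m) * common

# Hypothetical small sample (n = 25) from a beta(2, 5) scaled to [2, 5]:
rng = np.random.default_rng(5)
sample = 2.0 + 3.0 * rng.beta(2.0, 5.0, size=25)
a_hat, b_hat = beta_mom(sample, 2.0, 5.0)
```

The four-parameter problem is harder precisely because the support endpoints enter the moment equations nonlinearly, which is where a small-sample heuristic like the paper's earns its keep.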

16.
Monitoring times between events (TBE) is an important aspect of process monitoring in many areas of application. This is especially true in the context of high‐quality processes, where the defect rate is very low; in this context, control charts that monitor the TBE have been recommended in the literature as alternatives to the attribute charts that monitor the proportion of defective items produced. The Shewhart‐type t‐chart assuming an exponential distribution is one chart available for monitoring the TBE. The t‐chart was later generalized to the tr‐chart, based on the times between the occurrences of r (≥1) events, to improve performance. In these charts, the in‐control (IC) parameter of the distribution is assumed known. This is often not the case in practice, and the parameter has to be estimated before process monitoring and control can begin. We propose estimating the parameter from a phase I (reference) sample and study the effects of estimation on the design and performance of the charts. To this end, we focus on the conditional run length distribution so as to incorporate the ‘practitioner‐to‐practitioner’ variability (inherent in the estimates) that arises from different reference samples, which leads to different control limits (and hence to different IC average run length [ARL] values) and false alarm rates that can be far different from their nominal values. It is shown that the required phase I sample size needs to be considerably larger than what has typically been recommended in the literature in order to expect known‐parameter performance in phase II. We also find the minimum number of phase I observations that guarantees, with a specified high probability, that the conditional IC ARL will be at least equal to a given small percentage of a nominal IC ARL. Along the same lines, a lower prediction bound on the conditional IC ARL is also obtained to ensure that, for a given phase I sample, the smallest IC ARL can be attained with a certain (high) probability. A summary and recommendations are given. Copyright © 2014 John Wiley & Sons, Ltd.
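The 'practitioner-to-practitioner' variability is easy to visualize for the exponential t-chart, where the conditional false-alarm rate given the phase I estimate has a closed form; a sketch with a hypothetical nominal false-alarm rate:

```python
import numpy as np

def conditional_ic_arl(m, theta=1.0, alpha=0.0027, reps=5000, seed=None):
    """Simulated distribution of the conditional in-control ARL of a
    lower-sided exponential t-chart when the mean TBE is estimated from
    m phase I observations.

    Limit: LCL = -theta_hat * ln(1 - alpha), the nominal-alpha limit if
    theta_hat were the true mean.  Conditional on theta_hat, the actual
    false-alarm rate is 1 - exp(-LCL/theta) = 1 - (1-alpha)**(theta_hat/theta),
    and the conditional ARL is its reciprocal.
    """
    rng = np.random.default_rng(seed)
    # Mean of m exponentials: theta_hat ~ Gamma(shape=m, scale=theta/m).
    theta_hat = rng.gamma(m, theta / m, size=reps)
    far = 1.0 - (1.0 - alpha) ** (theta_hat / theta)
    return 1.0 / far

nominal_arl = 1.0 / 0.0027                 # about 370 if theta were known
arl_m20 = conditional_ic_arl(20, seed=2)   # small phase I sample
arl_m500 = conditional_ic_arl(500, seed=2) # large phase I sample
spread_small = float(np.std(arl_m20))
spread_large = float(np.std(arl_m500))
```

The wide spread of `arl_m20` versus `arl_m500` is exactly the phenomenon the abstract describes: small phase I samples give conditional IC ARLs far from the nominal 370.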

17.
Accelerated life testing (ALT) is the process of testing products by subjecting them to elevated stress conditions in order to observe more failure data in a short time period. In this study, we compare two-level constant-stress ALT (CSALT) and simple step-stress ALT (SSALT) based on competing risks of two or more failure modes with independent exponential lifetime distributions. Optimal sample size allocation in CSALT and the optimal stress change-time in SSALT are considered based on V- and D-optimality, respectively. Under Type-I censoring, numerical results show that the optimal SSALT outperforms the optimal CSALT in a wide variety of settings. We also show theoretically that the optimal SSALT is better than the optimal CSALT under a set of conditions. A real data example is analyzed to demonstrate the performance of the optimal plans for both ALTs.

18.
A variety of design‐process and design‐methods courses exist in engineering education. The primary objective of such courses is to teach engineering design fundamentals utilizing repeatable design techniques. By so doing, students obtain (1) tools they may employ during their education, (2) design experiences to understand the “big picture” of engineering, and (3) proven methods to attack open‐ended problems. While these skills are worthwhile, especially as design courses are moved earlier in curricula, many students report that design methods are typically taught at a high‐level and in a compartmentalized fashion. Often, the students' courses do not include opportunities to obtain incremental concrete experiences with the methods. Nor do such courses allow for suitable observation and reflection as the methods are executed. In this paper, we describe a new approach for teaching design methods that addresses these issues. This approach incorporates hands‐on experiences through the use of “reverse‐engineering” projects. As the fundamentals of design techniques are presented, students immediately apply the methods to actual, existing products. They are able to hold these products physically in their hands, dissect them, perform experiments on their components, and evolve them into new successful creations. Based on this reverse‐engineering concept, we have developed and tested new courses at The University of Texas, MIT, and the United States Air Force Academy. In the body of this paper, we present the structure of these courses, an example of our teaching approach, and an evaluation of the results.

19.
This article concerns the optimization of measurement plans in the design of bivariate degradation tests for bivariate Wiener processes. After describing an unbalanced measurement scheme for bivariate degradation tests, we derive the likelihood function and provide a method for estimating the model parameters that is based on maximum likelihood and least squares. From the corresponding Fisher information matrix, we deduce an important insight, namely that longer degradation tests and longer intervals between measurements in the test design result in more precise parameter estimates. We introduce a model for optimizing the degradation test measurement plan that incorporates practical constraints and objectives in the test design framework. We also present a search‐based algorithm to identify the optimal test measurement plan that is based on the aforementioned measurement rule. Via a simulation study and a case study involving the Rubidium Atomic Frequency Standard, we demonstrate the characteristics of optimal measurement plans for bivariate degradation test design and show the superiority of longer duration tests involving fewer samples compared to alternative designs that specify testing more samples over shorter periods of time. Copyright © 2013 John Wiley & Sons, Ltd.

20.
We examine and compare the finite-sample performance of the competing backfitting and integration methods for estimating additive nonparametric regression using simulated data. Although the asymptotic properties of the integration estimator, and to some extent those of the backfitting method, are well understood, their small-sample properties have not been well investigated. Apart from some small experiments in the papers cited above, there is little hard evidence concerning the exact distribution of the estimates. Our purpose is to provide an extensive finite-sample comparison between the backfitting procedure and the integration procedure using simulated data. The research was supported by the National Science Foundation, NATO, and the Deutsche Forschungsgemeinschaft, SFB 373.
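A minimal backfitting sketch for a two-component additive model, using a Nadaraya-Watson smoother on partial residuals (the bandwidth, sample size, and test functions are arbitrary choices for illustration, not those of the cited experiments):

```python
import numpy as np

def nw_smooth(x, y, h):
    """Nadaraya-Watson (Gaussian-kernel) smoother evaluated at the sample points."""
    d = (x[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * d ** 2)
    return (w @ y) / w.sum(axis=1)

def backfit(X, y, h=0.15, iters=20):
    """Backfitting for an additive model y = c + f1(x1) + ... + fp(xp) + noise:
    cycle over components, smoothing the partial residuals for each one."""
    n, p = X.shape
    c = y.mean()
    f = np.zeros((n, p))
    for _ in range(iters):
        for j in range(p):
            partial = y - c - f.sum(axis=1) + f[:, j]   # remove the other components
            fj = nw_smooth(X[:, j], partial, h)
            f[:, j] = fj - fj.mean()                    # center for identifiability
    return c, f

# Simulated additive model: y = sin(pi*x1) + x2^2 + noise.
rng = np.random.default_rng(11)
n = 300
X = rng.uniform(-1.0, 1.0, size=(n, 2))
y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(n)
c, f = backfit(X, y)
mse = float(np.mean((c + f.sum(axis=1) - y) ** 2))
corr1 = float(np.corrcoef(f[:, 0], np.sin(np.pi * X[:, 0]))[0, 1])
```

The integration estimator would instead smooth the full regression surface and average out the other coordinates; running both on replicated draws like this one is the kind of finite-sample comparison the paper carries out.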


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)