Similar Documents
20 similar documents found (search time: 31 ms)
1.
An acceptance sampling plan is usually determined by minimizing the expectation of the sum of the relevant costs involved. This expected-cost minimization approach, however, could still incur a very large cost with a probability that is unacceptable to a decision maker. This study therefore develops a risk-embedded model via conditional value-at-risk (CVaR) that allows a decision maker to choose an acceptance sampling plan with minimal expected excess cost in accordance with his or her attitude towards risk. We focus in this study on Bayesian acceptance sampling under Type II censoring for Weibull-distributed product lifetimes with known shape parameter and products sold with a general rebate warranty. We explore, through numerical analysis, the individual effects and cross-effects of a decision maker's attitude towards risk and various unit costs on the Bayesian acceptance sampling plans, aiming to gain insight into the role of risk aversion in the determination of such plans.
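The CVaR criterion described above can be illustrated with a minimal sketch (not the paper's Bayesian/Weibull model): given simulated total-cost samples for each candidate plan, pick the plan whose worst-tail average cost is smallest. The function names and the two toy plans below are hypothetical.

```python
import numpy as np

def cvar(costs, alpha=0.95):
    """Conditional value-at-risk: the mean of the worst (1 - alpha)
    fraction of the sampled total costs."""
    costs = np.sort(np.asarray(costs, dtype=float))
    k = int(np.ceil(alpha * len(costs)))          # index where the tail starts
    tail = costs[k:]
    return float(tail.mean()) if tail.size else float(costs[-1])

def pick_plan(cost_samples_by_plan, alpha=0.95):
    """Choose the candidate plan whose cost distribution has minimal CVaR."""
    return min(cost_samples_by_plan,
               key=lambda plan: cvar(cost_samples_by_plan[plan], alpha))
```

A risk-neutral (expected-cost) criterion can prefer a plan with a small mean cost but a heavy tail; CVaR penalizes the tail, which is the behavior the study exploits.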

2.
Sequential sampling represents the logical limit in lot-by-lot multiple acceptance sampling. Sequential plans are generally developed in order to fit a desired operating characteristic curve. The basic premise is that items are observed one at a time and classified as good or bad. Following each sampled item, and based upon the current cumulative results, a decision is made to accept the lot, reject the lot, or sample another item from the lot.

Inspection error severely distorts the operating characteristic performance measure of a sequential sampling plan. Yet sequential plans continue to be designed under the assumption of perfect assessment, even though it is well documented that inspection errors are common. This paper examines and illustrates the effects of error on the sequential sampling OC curve. These effects are developed mathematically beginning with Wald's formulation of the sequential probability ratio. The results are computerized, and a typical sequential plan is analyzed under several type 1 and type 2 inspection error pairs to determine the resulting OC curves.

Additionally, a method of sampling plan design is developed which compensates for the effects of known inspection error. That is, it provides the complete sampling plan which, if followed in an error-prone environment, yields the OC curve actually desired. This compensating method of sampling plan design has also been computerized. A copy of the entire program appears in the Appendix at the end of the paper.
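Wald's sequential probability ratio boundaries, and the apparent (error-distorted) fraction defective that drives the compensation idea above, can be sketched as follows. Here `e1`/`e2` denote the type 1 and type 2 inspection error rates; the formulas are the standard attribute-SPRT lines, not code from the paper.

```python
from math import log

def apparent_fraction(p, e1, e2):
    """Fraction of items *classified* defective when the true fraction is p;
    e1 = P(good item classified defective), e2 = P(defective classified good)."""
    return p * (1.0 - e2) + (1.0 - p) * e1

def sprt_boundaries(p1, p2, alpha, beta):
    """Wald's acceptance/rejection lines for cumulative defectives d after
    n items: accept when d <= -h1 + s*n, reject when d >= h2 + s*n."""
    g1 = log(p2 / p1)
    g2 = log((1.0 - p1) / (1.0 - p2))
    h1 = log((1.0 - alpha) / beta) / (g1 + g2)
    h2 = log((1.0 - beta) / alpha) / (g1 + g2)
    s = g2 / (g1 + g2)                 # slope lies between p1 and p2
    return h1, h2, s
```

To compensate for known inspection error, one would design the boundaries at `apparent_fraction(p1, e1, e2)` and `apparent_fraction(p2, e1, e2)` instead of at `p1` and `p2`, so that the plan actually followed in the error-prone environment yields the OC curve desired.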


3.
There is a great deal of literature dealing with the use of the computer in designing acceptance sampling plans. The general approach is to use some approximation technique to generate minimum sample size plans whose OC curves will approximate the desired Producer's Risk and Consumer's Risk levels. Since only rarely will a single such approximation satisfy both the (1 − α) and β requirements, some means, either averaging or selection, is used to select a plan. Plans so determined are acceptable but often not optimal, and plans with significantly smaller sample size may exist which are very close to optimal.

This paper reports the development and use of a computer program which may be used to design single sampling plans using either the binomial or Poisson distribution. The program also finds alternate plans with smaller sample size, and gives a measure of the proximity of such alternate plans to optimality. Some rudimentary artificial intelligence techniques are employed in the search and selection of optimal plans and the near-optimal alternative plans.

An extended version of the program also supports experimentation with a variety of criteria of optimality for the selection of candidate plans from those generated.

The main program is written for use under MS/PC-DOS in both Turbo Pascal and Turbo C. The extended version uses only Pascal.
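A minimal modern sketch of the kind of search such a program performs, using the binomial model: find the smallest sample size n and an acceptance number c whose OC curve meets the producer's risk at p1 and the consumer's risk at p2 (the function names are ours, not from the original Pascal/C program).

```python
from math import comb

def oc(n, c, p):
    """Probability of acceptance: P(X <= c) with X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1.0 - p)**(n - k) for k in range(c + 1))

def smallest_plan(p1, p2, alpha, beta, n_max=2000):
    """Smallest-n plan (n, c) whose OC curve is at or above 1 - alpha at p1
    (producer's risk) and at or below beta at p2 (consumer's risk)."""
    for n in range(1, n_max + 1):
        # smallest c keeping the producer's risk at or below alpha;
        # any larger c only raises the consumer's risk, so this is the
        # only candidate worth checking for this n
        c = next((c for c in range(n + 1) if oc(n, c, p1) >= 1.0 - alpha), n)
        if oc(n, c, p2) <= beta:
            return n, c
    return None
```

Because the acceptance probability is monotone in both c and p, checking one c per n suffices, and the first n found is minimal.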


4.
In this paper we design an acceptance single sampling plan with inspection errors when the fraction of defective items is a fuzzy number. We show that the operating characteristic curve of this plan is a band with upper and lower bounds, whose width depends on the ambiguity of the proportion parameter in the lot when the sample size and acceptance number are fixed. A comparison of the single sampling plans with and without inspection errors was carried out to study the effects on the operating characteristics. The results show that, for good process quality, the sampling plan with inspection errors has a lower operating characteristic band than the plan without inspection errors. We also show that incorrect classification of a good item reduces the fuzzy probability of acceptance, while incorrect classification of a defective item results in a higher fuzzy probability of acceptance.
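The OC band described above can be sketched by sweeping the acceptance probability over an interval of plausible fraction-defective values (a crisp interval stands in here for the paper's fuzzy number), with optional misclassification rates `e1`/`e2`:

```python
from math import comb

def accept_prob(n, c, p):
    """P(X <= c), X ~ Binomial(n, p): probability of accepting the lot."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

def oc_band(n, c, p_low, p_high, e1=0.0, e2=0.0, steps=200):
    """Bounds of the OC band when the fraction defective is only known to lie
    in [p_low, p_high]; e1/e2 are the rates at which good/defective items are
    misclassified, shifting the fraction the inspector actually observes."""
    def apparent(p):
        return p * (1 - e2) + (1 - p) * e1
    probs = [accept_prob(n, c, apparent(p_low + (p_high - p_low) * i / steps))
             for i in range(steps + 1)]
    return min(probs), max(probs)
```

Consistent with the abstract's findings, raising `e1` (good items misclassified as defective) pushes the whole band down, while raising `e2` pushes it up.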

5.
6.
The decision regarding acceptance or rejection of a lot of products may be made through variables acceptance sampling plans based on suitable quality characteristics. A variables sampling plan that determines the acceptability of a lot based on the lifetime of the products is called a reliability acceptance sampling plan (RASP). This work considers the determination of the optimum RASP under a cost constraint in the framework of hybrid censoring. Weibull lifetime models are considered for illustration; however, the proposed methodology can easily be extended to any location-scale family of distributions. The proposed method is based on asymptotic results for the estimators of the parameters of the lifetime distribution; hence, a Monte Carlo simulation study is conducted to show that the sampling plans meet the specified risks for finite sample sizes.

7.
Generating action sequences to achieve a set of goals is a computationally difficult task. When multiple goals are present, the problem is even worse. Although many solutions to this problem have been discussed in the literature, practical solutions focus on the use of restricted mechanisms for planning or the application of domain dependent heuristics for providing rapid solutions (i.e., domain-dependent planning). One previously proposed technique for handling multiple goals efficiently is to design a planner or even a set of planners (usually domain-dependent) that can be used to generate separate plans for each goal. The outputs are typically either restricted to be independent and then concatenated into a single global plan, or else they are merged together using complex heuristic techniques. In this paper we explore a set of limitations, less restrictive than the assumption of independence, that still allow for the efficient merging of separate plans using straightforward algorithmic techniques.
In particular, we demonstrate that for cases where separate plans can be individually generated, we can define a set of limitations on the allowable interactions between goals that allow efficient plan merging to occur. We propose a set of restrictions that are satisfied across a significant class of planning domains. We present algorithms that are efficient for special cases of multiple plan merging, propose a heuristic search algorithm that performs well in a more general case (where alternative partially ordered plans have been generated for each goal), and describe an empirical study that demonstrates the efficiency of this search algorithm.

8.
The scale parameter(s) of multi-scale hierarchical segmentation (MSHS), which groups pixels into objects of different sizes and organizes them hierarchically in multiple levels, such as the multiresolution segmentation (MRS) embedded in the eCognition software, directly determine the average size of segmented objects and strongly influence subsequent geographic object-based image analysis. Recently, some studies have provided solutions for searching the optimal scale parameter(s) by supervised strategies (with reference data) or unsupervised strategies (without reference data). They focused on designing metrics that indicate better scale parameter(s) but neglected the influence of the linear sampling method of the scale parameter used by default. Indeed, the linear sampling method not only requires a proper increment and a proper range to balance accuracy and efficiency in supervised strategies, but also performs badly in selecting multiple key scales for the MSHS of complex landscapes in unsupervised strategies. To address these drawbacks, we propose an exponential sampling method. It is based on our finding that the logarithm of the segment count and the logarithm of the scale parameter are linearly dependent, which we validate extensively on different landscapes in this study. The scale parameters sampled by the exponential sampling method and by the linear sampling method with increments of 2, 5, 10, 25, and 100, as used in most former studies, were evaluated and compared by two supervised strategies and an unsupervised strategy.
Results indicated that, when searching with the supervised strategies, the exponential sampling method achieved both high accuracy and efficiency, whereas the linear sampling method had to balance them through expert experience; and when searching with the unsupervised strategy, multiple key scale parameters in the MSHS of complex landscapes could be identified among the exponential sampling results, while the linear sampling results hardly achieved this. Considering these two merits, we recommend the exponential sampling method in place of the linear sampling method when searching for the optimal scale parameter(s) of MRS.
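The exponential sampling of scale parameters amounts to spacing trial scales with a constant ratio rather than a constant increment; a minimal sketch with NumPy (the bounds and count are arbitrary examples, not values from the study):

```python
import numpy as np

def exponential_scales(s_min, s_max, num):
    """Trial scale parameters equally spaced in log space (constant ratio),
    matching the observed linear log(segment count)-log(scale) relationship."""
    return np.geomspace(s_min, s_max, num=num)

linear = np.linspace(10, 1000, num=10)          # constant step of 110
exponential = exponential_scales(10, 1000, 10)  # constant ratio of 100**(1/9)
```

The linear grid wastes trials at large scales where segment counts barely change, while the exponential grid spends them where the segmentation changes by roughly constant factors.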

9.
In manufacturing industries, sampling inspection is a common practice for quality assurance and cost reduction. The basic decisions in sampling inspection are how many manufactured items to sample from each lot and how many defective items found in the sample should lead to accepting or rejecting the lot. Because of the combinatorial nature of the alternative choices of sample sizes and acceptance criteria, the problem of determining an optimal sampling plan is NP-complete. In this paper, a neurally-inspired approach to generating acceptance sampling inspection plans is proposed. A Bayesian cost model of multi-stage, multi-attribute sampling inspections for quality assurance in serial production systems is formulated. This model can accommodate various dispositions of rejected lots, such as scrapping and screening, and can also reflect the relationships between stages and among attributes. To determine the sampling plans based on the formulated model, a neurally-inspired stochastic algorithm is developed. This algorithm simulates the state transitions of a primal-dual stochastic neural network to generate the sampling plans: the simulated primal network is responsible for generating new states, whereas the dual network records the generated solutions. Starting with an arbitrary feasible solution, the algorithm converges to a near-optimal or optimal sampling plan through a sequence of monotonically improving solutions. The operating characteristics and performance of the algorithm are demonstrated via numerical examples.

10.
To address the disadvantages of classical sampling plans designed for traditional industrial products, we propose a two-rank acceptance sampling plan (TRASP) for the inspection of geospatial data outputs based on the acceptance quality level (AQL). The first-rank sampling plan inspects the lot consisting of map sheets, and the second inspects the lot consisting of features in an individual map sheet. The TRASP design is formulated as an optimization problem with respect to sample size and acceptance number, covering two lot-size cases. The first case is a small lot size, with nonconformities modeled by a hypergeometric distribution function, and the second is a larger lot size, with nonconformities modeled by a Poisson distribution function. The proposed TRASP is illustrated through two empirical case studies. Our analysis demonstrates that: (1) the proposed TRASP provides a general approach for quality inspection of geospatial data outputs consisting of non-uniform items and (2) the proposed acceptance sampling plan based on TRASP performs better than other classical sampling plans. It overcomes the drawbacks of percent sampling, i.e., "strictness for large lot size, toleration for small lot size," and those of a national standard designed specifically for industrial outputs, i.e., "lots of different sizes corresponding to the same sampling plan."
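The two lot-size cases above can be sketched directly: the acceptance probability P(X ≤ c) under a hypergeometric model for small lots and under a Poisson approximation for large lots (the parameter values in the test below are illustrative, not from the case studies):

```python
from math import comb, exp

def accept_hypergeom(N, D, n, c):
    """Small-lot case: P(accept) = P(X <= c), X ~ Hypergeometric
    (lot size N, D nonconforming items, sample size n)."""
    return sum(comb(D, k) * comb(N - D, n - k)
               for k in range(min(c, D, n) + 1)) / comb(N, n)

def accept_poisson(n, p, c):
    """Large-lot case: Poisson approximation with mean n*p."""
    lam = n * p
    term = cdf = exp(-lam)
    for k in range(1, c + 1):
        term *= lam / k                 # Poisson pmf computed recursively
        cdf += term
    return cdf
```

For a lot much larger than the sample, the two models nearly agree; for small lots, the hypergeometric model accounts for sampling without replacement, which is why the paper distinguishes the cases.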

11.
This paper is concerned with the stabilization problem for a class of switched linear parameter-varying (LPV) systems with Markovian jump parameters whose transition rate is completely unknown, or for which only an estimated value is known. Firstly, a new criterion for testing the stochastic stability of such systems is established. Then, using the multiple parameter-dependent Lyapunov function method, we design a parameter-dependent state-feedback controller for each switched LPV subsystem to guarantee stochastic stability of the closed-loop switched LPV system with Markovian jump parameters under uncertain transition rates. Finally, as an application of the proposed design method, the stabilization problem of a turbofan engine, which cannot be handled by existing methods, is investigated.

12.
Queries are the main workload of a database system, and their efficiency determines database performance. A query has multiple possible execution plans, and currently the query optimizer can only statically select a relatively good plan for a query according to the database system's configuration parameters. Concurrent queries contend for resources in complex and changing ways that configuration parameters can hardly capture accurately, and the efficiency of the same execution plan varies across scenarios. Execution plan selection under concurrent queries must therefore account for the mutual influence between queries, i.e., query interaction. Based on this, we propose QIs, a metric of how strongly a query is affected by query interaction under concurrent execution. For execution plan selection under concurrent queries, we also propose TRating, a method that dynamically selects an execution plan for a query: it compares, within a query mix, how strongly queries executed with different plans are affected by query interaction, and selects the plan affected least as the better plan for that query. Experimental results show that TRating selects the better execution plan with 61% accuracy, a 25% improvement over the query optimizer, and reaches 69% accuracy when selecting the second-best execution plan.

13.
In life tests, the progressive Type-II censoring methodology allows for the possibility of censoring a number of units each time a failure is observed. This results in a large number of possible censoring plans, depending on the number of both censoring times and censoring numbers. Employing maximum Fisher Information as an optimality criterion, optimal plans for a variety of lifetime distributions are determined numerically. In particular, exact optimal plans are established for some important lifetime distributions. While for some distributions, Fisher information is invariant with respect to the censoring plan, results for other distributions lead us to hypothesize that the optimal scheme is in fact always a one-step method, restricting censoring to exactly one point in time. Depending on the distribution and its parameters, this optimal point of censoring can be located at the end (right censoring) or after a certain proportion of observations. A variety of distributions is categorized accordingly. If the optimal plan is a one-step censoring scheme, the optimal proportion is determined. Moreover, the Fisher information as well as the expected time till the completion of the experiment for the optimal one-step censoring plan are compared with the respective quantities of both right censoring and simple random sampling.

14.
A method is presented for the robust design of flexible manufacturing systems (FMS) that are subject to forecasted product-plan variations. The resource allocation and the operation schedule of an FMS are modeled as a colored Petri net and an associated transition firing sequence. The robust design of the colored Petri net model is formulated as a multi-objective optimization problem that simultaneously minimizes the production costs under multiple production plans (batch sizes for all jobs) and the reconfiguration cost due to production plan changes. A genetic algorithm, coupled with the shortest imminent operation time (SIO) dispatching rule, is used to simultaneously find the near-optimal resource allocation and the event-driven schedule of a colored Petri net. The resulting Petri net is then compared with Petri nets optimized for a particular production plan in order to assess the effectiveness of the robustness optimization. The simulation results suggest that the proposed robustness optimization scheme should be considered when the products are moderately different in their job specifications, so that optimizing for a particular production plan inevitably creates bottlenecks in product flow and/or deadlock under other production plans.

15.
In life tests, the progressive Type-II censoring methodology allows for the possibility of censoring a number of units each time a failure is observed. This results in a large number of possible censoring plans, depending on the number of both censoring times and censoring numbers. Employing maximum Fisher Information as an optimality criterion, optimal plans for a variety of lifetime distributions are determined numerically. In particular, exact optimal plans are established for some important lifetime distributions. While for some distributions, Fisher information is invariant with respect to the censoring plan, results for other distributions lead us to hypothesize that the optimal scheme is in fact always a one-step method, restricting censoring to exactly one point in time. Depending on the distribution and its parameters, this optimal point of censoring can be located at the end (right censoring) or after a certain proportion of observations. A variety of distributions is categorized accordingly. If the optimal plan is a one-step censoring scheme, the optimal proportion is determined. Moreover, the Fisher information as well as the expected time till the completion of the experiment for the optimal one-step censoring plan are compared with the respective quantities of both right censoring and simple random sampling.

16.
The development of attribute sampling plans to meet individual users' requirements (as specified by AQL and LTPD values) is conceptually easy. However, the mechanics of the procedure can involve excessive computation or table look-ups. These tasks are easily delegated to a micro-computer. In addition, the powerful graphics routines available on TRS-80 and Apple computers can lend considerable perspective to the problem of choosing an acceptable sampling plan by displaying the entire operating characteristic curves of candidate plans. This paper discusses the use of micro-computers and computer graphics in the design of single and double attribute sampling plans. A program written in BASIC for the Apple III micro-computer is detailed and its capabilities illustrated by an example.
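A sketch of the OC-curve computation such a program would plot, here for the double attribute sampling case (accept on d1 ≤ c1, reject on d1 > c2, otherwise draw a second sample); this is the standard formula, not the original BASIC code:

```python
from math import comb

def pmf(n, k, p):
    """Binomial probability of exactly k defectives in n items."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def cdf(n, c, p):
    """Binomial probability of at most c defectives in n items."""
    return sum(pmf(n, k, p) for k in range(c + 1))

def double_plan_oc(n1, c1, n2, c2, p):
    """P(accept) for the double sampling plan (n1, c1, n2, c2): accept if
    d1 <= c1, reject if d1 > c2, otherwise draw n2 more items and accept
    if d1 + d2 <= c2."""
    pa = cdf(n1, c1, p)
    for d1 in range(c1 + 1, c2 + 1):
        pa += pmf(n1, d1, p) * cdf(n2, c2 - d1, p)
    return pa
```

Evaluating `double_plan_oc` over a grid of fraction-defective values p yields the OC curve that the described graphics routines would display.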

17.
In this paper we propose Hoare-style proof systems, called PR0D and PRKW0D, for plan generation and plan verification under the 0-approximation semantics of the action language AK. In PR0D (resp. PRKW0D), a Hoare triple of the form {X}c{Y} (resp. {X}c{KWp}) means that all literals in Y become true (resp. p becomes known) after executing plan c in a state satisfying all literals in X. The proof systems are shown to be sound and complete. More importantly, they give a way to efficiently generate and verify longer plans from existing verified shorter plans by applying the so-called composition rule, provided that enough shorter plans have been properly stored. The underlying idea is a tradeoff between space and time; we refer to it as off-line planning and point out that it could be applied to general planning problems.
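The composition rule, from {X}c1{Y} and {Y}c2{Z} infer {X}c1;c2{Z}, can be illustrated with a toy STRIPS-style checker (hypothetical actions with precondition/add/delete sets; this is a stand-in, not the AK semantics of the paper):

```python
def execute(plan, state, actions):
    """Run plan from state; each action maps to (preconditions, add, delete)
    sets of literals. Returns the final state, or None if a precondition fails."""
    state = set(state)
    for name in plan:
        pre, add, delete = actions[name]
        if not pre <= state:
            return None
        state = (state - delete) | add
    return state

def satisfies_triple(actions, X, plan, Y, initial_states):
    """Check {X} plan {Y} over the given test states: from every state
    satisfying all literals in X, plan runs and Y holds afterwards."""
    for s in initial_states:
        if X <= set(s):
            result = execute(plan, s, actions)
            if result is None or not Y <= result:
                return False
    return True
```

Given verified triples {X}c1{Y} and {Y}c2{Z}, the composition rule licenses {X}c1;c2{Z} without re-verifying the concatenated plan from scratch, which is the space-for-time tradeoff the paper describes.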

18.
SQL statement tuning is an important aspect of database performance tuning. The same result can be obtained with many different formulations of a SQL statement, and their performance can differ greatly. Even for a single SQL statement, Oracle has multiple ways to execute it, i.e., multiple execution plans. Oracle compares the performance and resource consumption of these execution plans and selects the best one. When evaluating each plan, Oracle relies on the statement's execution environment, i.e., the optimizer statistics, to estimate how many resources each plan will consume. Collecting statistics that are as accurate as possible is therefore crucial for Oracle to choose the optimal execution plan, and whether histograms are collected plays a very important role. This paper verifies the influence of histograms on SQL execution plans through experiments, clarifying under which circumstances histograms should be collected.

19.
Coordinated execution of tasks in a multiagent environment
This correspondence describes the application of discrete event control methods to provide conflict-free plan execution in a multiagent environment. This work uses planning methods to generate plans for multiple robots; the plans are then compiled into Petri nets for analysis, execution, and monitoring. Supervisory control techniques are applied to the Petri net controller to deal with conflicts that arise due to the presence of shared resources. Furthermore, by preserving the state of the system, replanning can occur at any time during execution to deal with unforeseen events.

20.
Facing an environment full of a variety of small-quantity customized requests, enterprises must provide diversified products to respond to customers' requests quickly and effectively. Among multiple product plans, both assembly sequence planning (ASP) and assembly line balancing (ALB) must be taken into consideration when selecting the optimal product plan, because the assembly sequence and assembly line balance have a significant impact on production efficiency. With different setup times among different assembly tasks, this is an NP-hard problem that cannot easily be solved by general methods. In this study, a multi-objective optimization mathematical model for product plan selection integrating ASP and ALB is established. The presented cases are solved with the established model connected to database statistics. The results show that the proposed guided, modified weighted Pareto-based multi-objective genetic algorithm (G-WPMOGA) can effectively solve this difficult problem. A comparison among three different hybrid algorithms shows that, on the ASP and ALB issues for multiple plans, G-WPMOGA has better problem-solving capability for four-objective optimization.


Copyright © 北京勤云科技发展有限公司 · 京ICP备09084417号