Similar Documents
Retrieved 20 similar documents (search time: 31 ms)
1.
To implement an effective and efficient quality system in a network of established environmental testing laboratories requires a committed long-term effort that is potentially fraught with multiple obstacles. This presentation discusses one state's ongoing efforts at implementing such a system. First is the need to convince management of the rationale for a quality-systems-based approach versus the traditional QA/QC program. Once development of a quality system has been sanctioned, a team-based approach utilizing project planning tools is a good way to approach the effort. Resources are assigned to the development of key quality system components, and generally a phased deployment or rollout works best. Once implementation is underway, assuring operational utilization of and compliance with the quality system are vital steps in the process. Important to successful implementation is ongoing assessment and refinement of the quality system. Fundamental and key elements of the laboratory quality system are numerous and need to work in concert with each other. Quality system elements to be discussed in the presentation range from management and QA roles and functions to the typical documentation of laboratory policies and procedures. Numerous QA assessment tools and other vital quality system practices that play an important role in making a complete quality system are addressed. In addition, efforts must be undertaken to integrate the laboratory quality system with other management systems within the organization. The bottom line is that all environmental laboratories need a quality system now more than ever. Data users need it. Customers' expectations for data quality are high. USEPA policy and/or programs call for it. Additionally, good quality systems can benefit the organization in multiple ways and help avoid the "pay-me-now or pay-me-later" syndrome.
In conclusion, all environmental testing laboratories (i.e., academic, private, commercial and especially governmental) need to invest in and implement a quality system based on a recognized standard (e.g., NELAC, ISO 17025, ANSI/ASQC E-4). The author recommends pursuing NELAP laboratory accreditation with a NELAP-recognized accrediting authority.

2.
The Corps of Engineers works with local restoration advisory boards (RAB) to exchange information and develop plans for restoration of closed military bases for civilian reuse. Meetings of the RAB to discuss progress in environmental assessment and restoration of former defense sites can be contentious due to the complex technical nature of the information to be shared and the personal stake that the members of the community have in ensuring that contaminated areas are restored for safe use. A prime concern of community representatives is often the quality of the data used to make environmental decisions. Laboratory case narratives and data flags may suggest laboratory errors and low data quality to those without an understanding of the information's full meaning. RAB members include representatives from local, state, and tribal governments, the Department of Defense, the Environmental Protection Agency, and the local community. The Corps of Engineers representatives usually include project technical and management personnel, but these individuals may not have sufficient expertise in the project quality assurance components and laboratory data quality procedures to completely satisfy community concerns about data quality. Communication of this information to the RAB by a quality assurance professional could serve to resolve some of the questions members have about the quality of acquired data and proper use of analytical results, and increase community trust that appropriate decisions are made regarding restoration. Details of the effectiveness of including a quality assurance professional in RAB discussions of laboratory data quality and project quality management are provided in this paper.

3.
In real life, decisions are usually made by comparing different options with respect to several, often conflicting criteria. This requires subjective judgements on the importance of different criteria by decision-makers (DMs) and increases uncertainty in decision making. This article demonstrates how uncertainty can be handled in multi-criteria decision situations using Compromise Programming, one of the Multi-criteria Decision Analysis (MCDA) techniques. Uncertainty is characterised using a probabilistic approach and propagated using a Monte Carlo simulation technique. The methodological approach is illustrated on a case study which compares the sustainability of two options for electricity generation: coal versus biomass. Different models have been used to quantify their sustainability performance for a number of economic, environmental and social criteria. Three cases are considered with respect to uncertainty: (1) no uncertainty, (2) uncertainty in data/models and (3) uncertainty in models and decision-makers' preferences. The results show how characterising and propagating uncertainty can help increase the effectiveness of multi-criteria decision making processes and lead to more informed decisions.
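
The propagation step described above can be sketched in miniature: a Monte Carlo loop around a Compromise Programming ranking (here an L1 distance to the ideal point). All criterion distributions, weights, and numbers below are illustrative assumptions, not values taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000  # Monte Carlo draws

# Assumed criterion distributions (mean, std) per option.
# Criteria: cost (minimise), CO2 emissions (minimise), jobs (maximise).
options = ["coal", "biomass"]
means = np.array([[45.0, 900.0, 0.2],   # coal
                  [60.0, 100.0, 0.4]])  # biomass
stds = np.array([[5.0, 60.0, 0.02],
                 [8.0, 30.0, 0.05]])
weights = np.array([0.4, 0.4, 0.2])
minimise = np.array([True, True, False])

wins = 0
for _ in range(N):
    x = rng.normal(means, stds)          # one sampled performance table
    x[:, ~minimise] *= -1                # turn maximise criteria into minimise
    lo, hi = x.min(axis=0), x.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)
    norm = (x - lo) / span               # 0 = ideal, 1 = anti-ideal
    dist = (weights * norm).sum(axis=1)  # L1 compromise-programming distance
    wins += dist[1] < dist[0]            # is biomass closer to the ideal?

p_biomass = wins / N
print(f"P(biomass preferred over coal) ~ {p_biomass:.2f}")
```

Rather than a single ranking, the output is the probability that one option outranks the other under uncertainty, which is exactly what makes case (2) in the abstract more informative than case (1).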

4.
This article proposes a simple strategy for establishing sensitivity requirements (quantitation limits) for environmental chemical analyses when the primary data quality objective is to determine if a contaminant of concern is greater or less than an action level (e.g., an environmental "cleanup goal," regulatory limit, or risk-based decision limit). The approach assumes that the contaminant concentrations are normally distributed with constant variance (i.e., the variance is not significantly dependent upon concentration near the action level). When the total or "field" portion of the measurement uncertainty can be estimated, the relative uncertainty at the laboratory's quantitation limit can be used to determine requirements for analytical sensitivity. If only the laboratory component of the total uncertainty is known, the approach can be used to identify analytical methods or laboratories that will not satisfy objectives for sensitivity (e.g., when selecting methodology during project planning).
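
One hedged way to turn such a strategy into numbers is sketched below. It assumes the common convention that the quantitation limit (QL) equals ten times the measurement standard deviation, plus a symmetric "gray region" around the action level within which decision errors are tolerated; the article's exact formulation may differ.

```python
from statistics import NormalDist

def max_quantitation_limit(action_level, gray_fraction=0.2,
                           alpha=0.05, beta=0.05, ql_sigma_ratio=10.0):
    """Largest quantitation limit (QL) compatible with deciding whether a
    contaminant exceeds the action level.

    Assumes normally distributed measurements with constant standard
    deviation s near the action level (as in the article), the common
    convention QL = ql_sigma_ratio * s, and a gray region of
    +/- gray_fraction * action_level within which false-accept and
    false-reject rates alpha and beta are tolerated.
    """
    z = NormalDist().inv_cdf
    delta = gray_fraction * action_level          # half-width of gray region
    s_max = delta / (z(1 - alpha) + z(1 - beta))  # largest tolerable SD
    return ql_sigma_ratio * s_max

# Example: 100 ug/kg cleanup goal, 5% false accept/reject rates
ql = max_quantitation_limit(100.0)
print(f"Required QL <= {ql:.1f} ug/kg")
```

A laboratory whose reported QL exceeds this bound could be screened out during project planning, as the abstract suggests.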

5.
EPA's Great Lakes National Program Office (GLNPO) is leading one of the most extensive studies of a lake ecosystem ever undertaken. The Lake Michigan Mass Balance Study (LMMB Study) is a coordinated effort among state, federal, and academic scientists to monitor tributary and atmospheric pollutant loads, develop source inventories of toxic substances, and evaluate the fate and effects of these pollutants in Lake Michigan. A key objective of the LMMB Study is to construct a mass balance model for several important contaminants in the environment: PCBs, atrazine, mercury, and trans-nonachlor. The mathematical mass balance models will provide a state-of-the-art tool for evaluating management scenarios and options for control of toxics in Lake Michigan. At the outset of the LMMB Study, managers recognized that the data gathered and the model developed from the study would be used extensively by data users responsible for making environmental, economic, and policy decisions. Environmental measurements are never true values and always contain some level of uncertainty. Decision makers, therefore, must recognize and be sufficiently comfortable with the uncertainty associated with data on which their decisions are based. The quality of data gathered in the LMMB was defined, controlled, and assessed through a variety of quality assurance (QA) activities, including QA program planning, development of QA project plans, implementation of a QA workgroup, training, data verification, and implementation of a standardized data reporting format. As part of this QA program, GLNPO has been developing quantitative assessments that define data quality at the data set level. GLNPO also is developing approaches to derive estimated concentration ranges (interval estimates) for specific field sample results (single study results) based on uncertainty. 
The interval estimates must be used with consideration of how they were derived and of the types of variability that are and are not included in the interval.

6.
Field-portable test methods may be quantitative, semi-quantitative, or qualitative, and screening methods are often used in the field to determine whether the concentration of a toxic substance exceeds regulatory or recommended standards or action levels. For on-site analysis, accurate quantitative tests for field measurements may not be available, depending on the analyte(s) or specific field situation. Thus, in lieu of more definitive test methods, screening tests which are based on qualitative or semi-quantitative methods are often used for making immediate decisions in the field, e.g. for compliance or risk assessment. Also, quantitative methods may be used for screening purposes in many instances. To ensure the quality of these screening tests and the decisions that are made based upon their results, screening methods need to be evaluated with sufficient data and should meet basic performance criteria prior to their being employed for decision-making purposes. Although quantitative, semi-quantitative and qualitative methods demonstrate different characteristics, it is desired that the performance criteria for all three method categories be consistent. If there is consistency, then one can have a sound basis for selecting the most appropriate test(s) for a given application. In order to unify the performance criteria for the different types of methods, a performance function is used to characterise both qualitative and semi-quantitative methods; in turn, this performance function is related to that for quantitative methods. False negative rates, false positive rates, sensitivity and specificity are key characteristics of screening methods that can be determined from the pertinent performance curves. The performance characteristics of each method are related to the uncertainty region that is associated with each method and the applicable uncertainty regions can be gleaned from the performance curves.
Also, various options for using multiple test results to improve decision making are provided.
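
The performance-function idea can be illustrated with a toy logistic response curve. The shape, slope, and action level below are assumptions for the sketch, not values from the article; the point is how the false positive/negative rates and the uncertainty region all fall out of one curve.

```python
import math

SLOPE = 8.0   # steepness of the response curve (assumed)
AL = 50.0     # action level, e.g. mg/kg (assumed)

def detection_probability(conc, action_level=AL, slope=SLOPE):
    """Performance function of a screening test: probability of a
    'positive' (analyte exceeds the action level) response as a function
    of the true concentration. A logistic shape is assumed."""
    return 1.0 / (1.0 + math.exp(-slope * (conc / action_level - 1.0)))

# False positive rate: positive response when the true value is well below AL
fp = detection_probability(0.5 * AL)
# False negative rate: negative response when the true value is well above AL
fn = 1.0 - detection_probability(1.5 * AL)

# Uncertainty region: the concentration band where the test is unreliable,
# defined here as detection probability between 5% and 95%.
lo = AL * (1 + math.log(0.05 / 0.95) / SLOPE)
hi = AL * (1 + math.log(0.95 / 0.05) / SLOPE)

print(f"FP at 0.5*AL: {fp:.3f}, FN at 1.5*AL: {fn:.3f}")
print(f"Uncertainty region: {lo:.1f} to {hi:.1f}")
```

A steeper slope shrinks the uncertainty region, which is one way the "consistent performance criteria" across qualitative, semi-quantitative, and quantitative methods can be compared.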

7.
An increasing emphasis on chemical process safety over the last two decades has led to the development and application of powerful risk assessment tools. Hazard analysis and risk evaluation techniques have developed to the point where quantitatively meaningful risks can be calculated for processes and plants. However, the results are typically presented in semi-quantitative "ranked list" or "categorical matrix" formats, which are certainly useful but not optimal for making business decisions. A relatively new technique for performing valuation under uncertainty, value at risk (VaR), has been developed in the financial world. VaR is a method of evaluating the probability of a gain or loss by a complex venture, by examining the stochastic behavior of its components. We believe that combining quantitative risk assessment techniques with VaR concepts will bridge the gap between engineers and scientists who determine process risk and business leaders and policy makers who evaluate, manage, or regulate risk. We present a few basic examples of the application of VaR to hazard analysis in the chemical process industry.
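
A minimal sketch of VaR applied to process risk follows, assuming a single incident scenario with a lognormal loss severity. All parameters are invented for illustration; a real hazard analysis would combine many scenarios with frequencies from the risk assessment.

```python
import random

random.seed(7)
N = 100_000  # simulated plant-years

# Assumed single incident scenario: in any year an incident occurs with
# probability p_incident, and its loss severity is lognormally distributed.
p_incident = 0.2
mu, sigma = 13.0, 1.2  # lognormal parameters of the loss severity (USD)

losses = sorted(
    random.lognormvariate(mu, sigma) if random.random() < p_incident else 0.0
    for _ in range(N)
)

var_95 = losses[int(0.95 * N)]   # 95% value at risk: loss exceeded in 5% of years
expected_loss = sum(losses) / N  # average annual loss

print(f"Expected annual loss: ${expected_loss:,.0f}")
print(f"95% VaR:              ${var_95:,.0f}")
```

Unlike a ranked list or risk matrix, the resulting loss distribution gives business decision-makers a monetary quantile they can compare directly against insurance, mitigation, or capital-allocation costs.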

8.
Issues related to improvement in the quality of products and to environmental protection in the economic policy of many countries and in the strategies of institutions and international organisations (e.g. European Union) have increased in importance in recent years as a consequence of the increase in environmental awareness of consumers. All these institutions currently recommend a comprehensive assessment of the effectiveness of planned projects during the decision-making process, taking into account both economic and environmental factors. It is, therefore, important to develop methods and tools to assess environmental performance as a support to a proper choice of investment activities. The aim of this paper is to develop algorithms to link the life cycle assessment (LCA) model associated with environmental issues and the life cycle cost analysis (LCCA) model associated with economic factors to permit an integrated assessment of investment projects. The combination of LCA and LCCA results enables the assessment of ongoing or planned investments and should be prioritised when making strategic decisions. In this paper, three environmentally friendly pathways (algorithms) using LCA–LCCA indicators as a support for decision-making processes were proposed: the first for implementing any environmental investments, the second for modernisation and innovation investments, and the third for new investments.

9.
One of the major challenges in remediating contaminated sites is having quick access to quality data on which to base remedial decisions as onsite work progresses. Case studies are presented from two Superfund sites where field screening and field analyses are used to provide these data. Emphasis is placed on the importance of high quality field data, as these data are the basis for remedial decisions prior to receipt of offsite laboratory confirmation. The decision-making processes for remediating contaminated soils and structures are presented in addition to project specifics including data quality objectives, field data collection procedures, quality assurance/quality control procedures, and comparisons of the field data with offsite laboratory results.

10.
Asset managers in electricity distribution companies generally recognize the need and the challenge of adding structure and a higher degree of formal analysis into the increasingly complex asset management decisions. This implies improving the present asset management practice by making the best use of the available data and expert knowledge, by adopting new methods for risk analysis and decision support, and, not least, by finding better ways to document the decisions made. This paper discusses methods for integrating risk analysis and multi-criteria decision support under uncertainty in electricity distribution system asset management. The focus is on how to include the different company objectives and risk analyses into a structured decision framework when deciding how to handle the physical assets of the electricity distribution network. This paper presents an illustrative example of decision support for maintenance and reinvestment strategies based on expert knowledge, simplified risk analyses, and multi-criteria decision analysis under uncertainty.

11.
We analyze the components of uncertainty in decision making and consider a risk evaluation model in decision making in a risk situation in economic activities. A modified cost-benefit criterion is proposed for decision making for implementing a project, taking into account the average losses due to wrong decisions. Translated from Izmeritel'naya Tekhnika, No. 9, pp. 27–29, September 2005.
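
One possible reading of such a modified criterion is sketched below with hypothetical numbers; the article's exact formulation of the average losses is not given in the abstract, so this only illustrates the idea of penalising the plain cost-benefit figure by the expected loss from wrong decisions.

```python
def modified_cost_benefit(benefit, cost, p_wrong, avg_loss_if_wrong):
    """Cost-benefit criterion adjusted for decision risk: accept the
    project only if the benefit exceeds the cost plus the expected loss
    from wrong decisions. (Illustrative sketch; the article's exact
    criterion may differ.)"""
    return benefit - cost - p_wrong * avg_loss_if_wrong

# Hypothetical project: plain net benefit 300,000, but a 10% chance of a
# wrong decision costing 1,500,000 on average.
net = modified_cost_benefit(benefit=1_000_000, cost=700_000,
                            p_wrong=0.1, avg_loss_if_wrong=1_500_000)
print(f"Risk-adjusted net benefit: {net:,.0f}")  # positive -> implement
```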

12.
Reducing process variability is presently an area of much interest in manufacturing organizations. Programmes such as Six Sigma robustly link the financial performance of the organization to the degree of variability present in the processes and products of the organization. Data, and hence measurement processes, play an important part in driving such programmes and in making key manufacturing decisions. In many organizations, however, little thought is given to the quality of the data generated by such measurement processes. By using potentially flawed data in making fundamental manufacturing decisions, the quality of the decision-making process is undermined and, potentially, significant costs are incurred. Research in this area is sparse and has concentrated on the technicalities of the methodologies available to assess measurement process capability. Little work has been done on how to operationalize such activities to give maximum benefit. From the perspective of one automotive company, this paper briefly reviews the approaches presently available to assess the quality of data and develops a practical approach, which is based on an existing technical methodology and incorporates simple continuous improvement tools within a framework which facilitates appropriate improvement actions for each process assessed. A case study demonstrates the framework and shows it to be sound, generalizable and highly supportive of continuous improvement goals. Copyright © 2003 John Wiley & Sons, Ltd.
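
As a small illustration of assessing measurement process capability, a precision-to-tolerance (P/T) ratio can be computed from repeated readings of a reference part. The data and tolerance below are invented, and the quoted thresholds are a common rule of thumb rather than this paper's methodology (a full gauge R&R study would also separate operator and part effects).

```python
import statistics

# Hypothetical gauge study: one appraiser measures a reference part
# eight times (data invented for illustration).
readings = [10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99, 10.00]
tolerance = 0.5  # total tolerance band of the measured characteristic

sigma_gauge = statistics.stdev(readings)  # measurement-process spread
ptr = 6 * sigma_gauge / tolerance         # precision-to-tolerance ratio
print(f"P/T ratio: {ptr:.1%}")
# Common rule of thumb: < 10% acceptable, 10-30% marginal, > 30% poor.
```

A high P/T ratio means the measurement system consumes a large share of the tolerance, so decisions based on that data are exactly the "potentially flawed" decisions the abstract warns about.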

13.
The main objective of the EVEREST project is the evaluation of the sensitivity of the radiological consequences associated with the geological disposal of radioactive waste to the different elements in the performance assessment. Three types of geological host formations are considered: clay, granite and salt. The sensitivity studies that have been carried out can be partitioned into three categories according to the type of uncertainty taken into account: uncertainty in the model parameters, uncertainty in the conceptual models and uncertainty in the considered scenarios. Deterministic as well as stochastic calculational approaches have been applied for the sensitivity analyses. For the analysis of the sensitivity to parameter values, the reference technique, which has been applied in many evaluations, is stochastic and consists of a Monte Carlo simulation followed by a linear regression. For the analysis of conceptual model uncertainty, deterministic and stochastic approaches have been used. For the analysis of uncertainty in the considered scenarios, mainly deterministic approaches have been applied.
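
The reference technique named above (Monte Carlo simulation followed by linear regression) can be sketched with a toy response model standing in for a performance-assessment code. The model form and parameter distributions are assumptions for the sketch only.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 5000  # Monte Carlo runs

# Toy response model in place of a performance-assessment code:
# dose depends on three uncertain parameters (distributions assumed).
k_sorption = rng.lognormal(0.0, 0.5, N)   # sorption coefficient
v_flow = rng.lognormal(-1.0, 0.3, N)      # groundwater velocity
inventory = rng.normal(1.0, 0.1, N)       # source inventory

dose = inventory * v_flow / (1.0 + k_sorption)

# Standardise inputs and output, then regress: the fitted coefficients
# are standardised regression coefficients (SRCs), a common Monte Carlo
# sensitivity measure.
X = np.column_stack([k_sorption, v_flow, inventory])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (dose - dose.mean()) / dose.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)

for name, b in zip(["k_sorption", "v_flow", "inventory"], src):
    print(f"SRC({name}) = {b:+.2f}")
```

The sign and magnitude of each SRC indicate which parameters drive the output spread, which is the information the EVEREST sensitivity studies extract from their far larger models.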

14.
This study employs fuzzy logic to evaluate uncertain component end-of-life (EOL) options in the design stage. Determining EOL strategies during the product design stage can be complex. For example, EOL strategies for retired bicycle components are various and may change with geographic location. Thus, adopting fixed EOL strategies in the product design stage may not always be appropriate; the element of uncertainty should be considered. Limited research has examined uncertainty of EOL strategies during the design stage. Moreover, the evaluation of EOL strategies in a comprehensive manner has not been shown in a realistic case study. These facts motivate this investigation. Fourteen evaluation criteria are used to generate a comprehensive framework for assessing seven EOL strategies. The evaluation process generates the likelihood for each of these strategies by aggregating fuzzy set operations and a left–right fuzzy ranking method. Using a SUMPRODUCT calculation over these weights/probabilities and the input sustainability values (i.e., cost, environmental impact and labor time), expected values are derived to represent the sustainability values for each EOL strategy. A Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS)-based method is employed to identify the appropriate EOL strategy for each component/product. A refrigerator is used as a case study to illustrate the methodology. This study addresses the uncertainty involved in identifying an EOL strategy for a specific product component during the design stage through the use of fuzzy logic. The method closes a gap in the current EOL strategy assessment criteria and introduces a comprehensive evaluation framework to capture multiple strategic perspectives by incorporating 14 key evaluation criteria.
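
The TOPSIS step can be sketched as follows, using an assumed decision matrix of cost-type sustainability values for four hypothetical EOL strategies (the paper evaluates seven, with fuzzy-derived weights; everything below is illustrative).

```python
import numpy as np

# Assumed decision matrix: rows = hypothetical EOL strategies, columns =
# sustainability values (cost, environmental impact, labor time), all
# "smaller is better". Numbers are invented for the sketch.
strategies = ["reuse", "remanufacture", "recycle", "landfill"]
X = np.array([[2.0, 1.0, 3.0],
              [4.0, 2.0, 5.0],
              [3.0, 3.0, 2.0],
              [1.0, 9.0, 1.0]])
w = np.array([0.4, 0.4, 0.2])              # criterion weights (assumed)
benefit = np.array([False, False, False])  # no benefit-type criteria here

V = w * X / np.linalg.norm(X, axis=0)      # weighted normalised matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
d_pos = np.linalg.norm(V - ideal, axis=1)  # distance to ideal solution
d_neg = np.linalg.norm(V - worst, axis=1)  # distance to anti-ideal
closeness = d_neg / (d_pos + d_neg)        # 1 = best, 0 = worst

best = strategies[int(np.argmax(closeness))]
print(dict(zip(strategies, closeness.round(3))), "->", best)
```

With these numbers landfill scores well on cost and labor but is dragged down by its environmental impact, so ranking by relative closeness rather than any single criterion changes the outcome.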

15.
Environmental data quality improvement continues to focus on analytical laboratory performance with little, if any, attention given to improving the performance of field consultants responsible for sample collection. Many environmental professionals often assume that the primary opportunity for data error lies within the activities conducted by the laboratory. Experience in the evaluation of environmental data and project-wide quality assurance programs indicates that an often-ignored factor affecting environmental data quality is the manner in which a sample is acquired and handled in the field. If a sample is not properly collected, preserved, stored, and transported in the field, even the best laboratory practices and analytical methods cannot deliver accurate and reliable data (i.e., bad data in equals bad data out). Poor quality environmental data may result in inappropriate decisions regarding site characterization and remedial action. Field auditing is becoming an often-employed technique for examining the performance of the environmental sampling field team and how their performance may affect data quality. The field audits typically focus on: (1) verifying that field consultants adhere to project control documents (e.g., Work Plans and Standard Operating Procedures [SOPs]) during field operations; (2) providing third-party independent assurance that field procedures, quality assurance/quality control (QA/QC) protocols, and field documentation are sufficient to produce data of satisfactory quality; (3) providing a defense in the event that field procedures are called into question; and (4) identifying ways to reduce sampling costs. Field audits are typically most effective when performed on a surprise basis; that is, the sampling contractor may be aware that a field audit will be conducted during some phase of sampling activities but is not informed of the specific day(s) that the audit will be conducted.
The audit also should be conducted early in the sampling program so that deficiencies noted during the audit can be addressed before the majority of field activities have been completed. A second audit should be performed as a follow-up to confirm that the recommended changes have been implemented. A field auditor is assigned to the project by matching, as closely as possible, the auditor's experience with the type of field activities being conducted. The auditor uses a project-specific field audit checklist developed from key information contained in project control documents. Completion of the extensive audit checklist during the audit focuses the auditor on evaluating each aspect of field activities being performed. Rather than examine field team performance after sampling, a field auditor can do so while the samples are being collected and can apply real-time corrective action as appropriate. As a result of field audits, responsible parties often observe vast improvements in their consultant's field procedures and, consequently, receive more reliable and representative field data at a lower cost. The cost savings and improved data quality that result from properly completed field audits make the field auditing process both cost-effective and functional.

16.
In the current paper, we model the duration of recovery of used products as a variable that depends on each unit's quality. Because of the uncertainty related to returned units' quality, the necessary time for the recovery of a lot is a random variable. We provide analytical expressions for the optimisation of recovery planning decisions under different assumptions regarding quality and demand characteristics. In addition, through an extensive numerical study, we examine the impact of the different parameters on the necessity to consider explicitly the stochastic nature of recovery lead-time. Moreover, we discuss the advisability of establishing procedures for the classification of returns according to their quality condition. As our findings indicate, overlooking quality uncertainty can increase related costs considerably because of poor process coordination. Furthermore, ignoring variability may result in undue overestimation of the efficiency of lot-sizing policies. On the other hand, the establishment of quality assessment procedures is worthwhile only when the stochastic behaviour of quality cannot be taken into account explicitly.

17.
One of the challenges facing professionals in the environmental arena today is the collection and assessment of large amounts of environmental analytical data. The assessment of the quality of that data is essential as multi-million dollar decisions for environmental site cleanups and/or long term monitoring efforts are made based on the analytical results. Also critical to environmental programs is the sharing and access of data across multiple data users. The ability to share data allows for better use of the limited resources available to clean up and monitor contaminated environmental sites. Standardization of electronic deliverables allows for collection of data from multiple data collectors into a single database for use by numerous data users and stakeholders on a project. This paper discusses the benefits of using a standard electronic data deliverable (EDD) format and environmental data assessment software tools for project planning and data assessment throughout the duration of the environmental project.

18.
In this paper, we explore the potential for strategic environmental assessment (SEA) to be a useful tool for banks to manage environmental risks and inform lending decisions. SEA is an environmental assessment tool that was developed to assist strategic-level decision-makers, such as policy-makers, planners, government authorities and environmental practitioners in improving developmental outcomes, aiming to facilitate the transition to sustainable development. We propose that SEA may also be a valuable tool for banks because it has the capacity to provide information about environmental risks at a time when it can be used as an input to bank lending decisions, which can assist banks in making lending decisions with better environmental outcomes. For these reasons, we argue that in some circumstances, and particularly for project finance transactions, SEA may be a more useful environmental assessment tool for lenders than environmental impact assessment, which many banks are currently relying on to help assess and mitigate environmental risks. Furthermore, we suggest that the use of SEA by banks would contribute to the sustainability goals of SEA.

19.
The decision as to whether a contaminated site poses a threat to human health and should be cleaned up relies increasingly upon the use of risk assessment models. However, the more sophisticated risk assessment models become, the greater the concern with the uncertainty in, and thus the credibility of, risk assessment. In particular, when there are several equally plausible models, decision makers are confused by model uncertainty and perplexed as to which model should be chosen for making decisions objectively. When the correctness of different models is not easily judged after objective analysis has been conducted, the cost incurred during the processes of risk assessment has to be considered in order to make an efficient decision. In order to support an efficient and objective remediation decision, this study develops a methodology to cost the least required reduction of uncertainty and to use the cost measure in the selection of candidate models. The focus is on identifying the efforts involved in reducing the input uncertainty to the point at which the uncertainty would not hinder the decision in each equally plausible model. First, this methodology combines a nested Monte Carlo simulation, rank correlation coefficients, and explicit decision criteria to identify key uncertain inputs that would influence the decision in order to reduce input uncertainty. This methodology then calculates the cost of required reduction of input uncertainty in each model by convergence ratio, which measures the needed convergence level of each key input's spread. Finally, the most appropriate model can be selected based on the convergence ratio and cost. A case of a contaminated site is used to demonstrate the methodology.
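
The first stage (Monte Carlo sampling plus rank correlation to identify key uncertain inputs) might look like this, with a toy exposure model and assumed distributions in place of the study's risk model.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 2000  # Monte Carlo samples

# Toy exposure model (form and distributions assumed, not from the paper):
# risk = concentration * intake rate * exposure frequency / body weight.
C = rng.lognormal(1.0, 0.8, N)    # soil concentration
IR = rng.lognormal(-2.0, 0.3, N)  # ingestion rate
EF = rng.uniform(0.3, 1.0, N)     # exposure frequency
BW = rng.normal(70.0, 10.0, N)    # body weight

risk = C * IR * EF / BW

def rank_corr(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return float(np.corrcoef(rx, ry)[0, 1])

# Inputs with large |rank correlation| are the key uncertain inputs whose
# spread must be reduced before it stops hindering the decision.
sens = {name: rank_corr(risk, x)
        for name, x in [("C", C), ("IR", IR), ("EF", EF), ("BW", BW)]}
print(sens)
```

In the study's methodology, only the inputs flagged here would be carried into the convergence-ratio costing, since tightening a weakly correlated input buys little decision certainty.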

20.
Implementation of a Quality Systems approach to making defensible environmental program decisions depends upon multiple, interrelated components. Often, these components are developed independently and implemented at various facility and program levels in an attempt to achieve consistency and cost savings. The U.S. Department of Energy, Office of Environmental Management (DOE-EM) focuses on three primary system components to achieve effective environmental data collection and use: (1) Quality System guidance, which establishes the management framework to plan, implement, and assess work performed; (2) a Standardized Statement of Work for analytical services, which defines data generation and reporting requirements consistent with user needs; and (3) a laboratory assessment program to evaluate adherence of work performed to defined needs, e.g., documentation and confidence. This paper describes how DOE-EM fulfills these requirements and realizes cost savings through participation in interagency working groups and integration of system elements as they evolve.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司), 京ICP备09084417号