20 similar documents found
1.
The Office for Analysis and Evaluation of Operational Data (AEOD) was established after the accident at Three Mile Island to improve the ways the US Nuclear Regulatory Commission (NRC) and the nuclear community use operating experience in identifying and resolving potential safety problems. One of the major missions of AEOD is to collect, screen, analyze and feed back operating experience to appropriate NRC offices, the nuclear community and the public. An important task within this mission is to assess the safety significance of numerous operating events and, for those determined to be significant, develop recommendations to eliminate the root causes of the event in order to prevent its recurrence. AEOD has developed and implemented a systematic framework for assessing the safety significance of operating events.
2.
Impact Assessment and Project Appraisal, 2013, 31(3): 201-208
Ecological assessment forms a fundamental part of environmental impact assessment. The quality of ecological input in 15 Bahraini EIA reports concerning coastal and marine developments produced between 1996 and 2004 was evaluated against adopted review criteria. Overall, eight reports were assessed as of borderline quality and seven as poor. Major shortcomings included limited new ecological surveys, inadequate evaluation of impacts, neglect of cumulative and long-term impacts, and failure to adequately address mitigation and monitoring measures.
3.
4.
I. Rosenthal, E. D. Weiler, R. L. Keener, P. J. Cumberland. Quality Assurance: Good Practice, Regulation, and Law, 1992, 1(2): 89-96
The Toxic Substances Control Act (TSCA) empowers the Environmental Protection Agency (EPA) to regulate risk associated with the use of existing chemicals and the introduction of new chemicals into commerce. Due to a number of concerns, however, efforts to regulate existing chemicals under TSCA have enjoyed limited success. A more generic and flexible approach is needed to achieve significant risk reduction for existing chemicals. This paper presents a framework for a generic approach to the regulation of existing chemicals. Under this framework, EPA would officially recognize that the distribution of chemical substances without evaluating and communicating to the user how to avoid operationally undesirable exposures represents an unreasonable risk to health or the environment. Acting under the authority of TSCA, EPA would then generically require suppliers to communicate acceptable exposure levels and information regarding safe use. This framework is consistent with the express policy of TSCA, which states that development of data with respect to the effects of chemical substances and mixtures on health and the environment should be the responsibility of manufacturers and processors of chemicals. The approach outlined here is consistent with and complements initiatives of the Office of Toxic Substances (OTS) and would enable OTS to accomplish some of the fundamental goals of TSCA.
5.
Michael A. Counte, Shien Guo, Teng-Fu Lin, W. T. Workman, James C. Romeis. Quality Assurance: Good Practice, Regulation, and Law, 2005, 11(2): 85-102
The continued rapid worldwide diffusion of clinical hyperbaric facilities has substantially increased interest in clinical quality assessment and service improvement. This paper examines major issues, perspectives, and methods integral to the measurement and improvement of the quality of care provided to hyperbaric patients and their relevance and applicability across different societies. Special focus is directed toward the importance of quality assessment and improvement of clinical hyperbaric care, multiple stakeholder perspectives on improved clinical quality, measurement of clinical outcomes of hyperbaric care, importance of facility accreditation, process improvement methods, and the future importance of quality management in clinical hyperbaric facilities.
6.
Scientometrics - With the emergence of Web 2.0, an online platform that encourages the online creation of next-generation tools, communication has become a nearly indispensable tool for researchers....
7.
Supply chain managers are responsible for making decisions regarding supply chain risk in order to mitigate the impact of supply chain disruptions. This study develops and tests a theoretical model that leverages the individual-level knowledge-based view to understand the process through which a supply chain manager's risk mitigation orientation contributes to his or her absorptive capacity. A supply chain manager's absorptive capacity, in turn, enhances his or her ability to effectively mitigate supply chain risk. The study findings demonstrate that supply chain managers with a high risk mitigation orientation have a greater level of absorptive capacity, which enhances their risk mitigation competency. This study represents the first development and testing of a model that examines individual-level knowledge management factors affecting supply chain risk mitigation competency. This research emphasises the importance of the individual supply chain manager in managing risk and illustrates how theoretical perspectives from the knowledge management, supply chain risk and organisational behaviour literature can be fruitfully adopted to explain behaviour in the field of supply chain risk management.
8.
Driving risk varies substantially among drivers. Identifying and predicting high-risk drivers will greatly benefit the development of proactive driver education programs and safety countermeasures. The objective of this study is twofold: (1) to identify factors associated with individual driver risk, and (2) to predict high-risk drivers using demographic, personality, and driving characteristic data. The 100-Car Naturalistic Driving Study was used for methodology development and application. A negative binomial regression model was adopted to identify significant risk factors. The results indicated that the driver's age, personality, and critical incident rate had significant impacts on crash and near-crash risk. For the second objective, drivers were classified into three risk groups based on crash and near-crash rate using K-means clustering. The cluster analysis identified approximately 6% of drivers as high-risk drivers, with an average crash and near-crash (CNC) rate of 3.95 per 1,000 miles traveled; 12% as moderate-risk drivers (average CNC rate = 1.75); and 84% as low-risk drivers (average CNC rate = 0.39). Two logistic models were developed to predict the high- and moderate-risk drivers. Both models showed high predictive power, with areas under the receiver operating characteristic curve of 0.938 and 0.930. This study concluded that crash and near-crash risk for individual drivers is associated with critical incident rate, demographic, and personality characteristics. Furthermore, the critical incident rate is an effective predictor of high-risk drivers.
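As a rough illustration of the two-step workflow summarized above (cluster drivers into risk groups by CNC rate, then predict membership in the high-risk group), the Python sketch below uses synthetic data; the column names, distributions, and thresholds are assumptions for illustration, not the 100-Car study data or the authors' code.

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 100  # roughly the number of primary drivers in the 100-Car study
drivers = pd.DataFrame({
    "age": rng.integers(18, 70, n),
    "sensation_seeking": rng.normal(0, 1, n),          # stand-in personality score
    "critical_incident_rate": rng.gamma(2.0, 0.5, n),  # incidents per 1,000 miles (synthetic)
})
# Synthetic CNC rate, loosely driven by the incident rate (illustration only).
drivers["cnc_rate"] = 0.3 * drivers["critical_incident_rate"] + rng.gamma(1.0, 0.3, n)

# Step 1: K-means on the CNC rate to form low/moderate/high risk groups.
km = KMeans(n_clusters=3, n_init=10, random_state=0)
drivers["cluster"] = km.fit_predict(drivers[["cnc_rate"]])
order = drivers.groupby("cluster")["cnc_rate"].mean().sort_values().index
drivers["risk_group"] = drivers["cluster"].map(dict(zip(order, ["low", "moderate", "high"])))

# Step 2: logistic model predicting high-risk membership; AUC summarizes predictive power.
X = drivers[["age", "sensation_seeking", "critical_incident_rate"]]
y = (drivers["risk_group"] == "high").astype(int)
clf = LogisticRegression(max_iter=1000).fit(X, y)
print("AUC:", roc_auc_score(y, clf.predict_proba(X)[:, 1]))
```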
9.
Comparative risk assessment (CRA) is a systematic procedure for evaluating the environmental problems affecting a geographic area. This paper looks beyond the U.S. border and examines the experience with CRAs conducted in various developing countries and economies in transition, including Bangkok (Thailand), Cairo (Egypt) and Quito (Ecuador), as well as other locations in Eastern Europe, Asia and Central and South America. A recent pilot CRA conducted in Taiwan is also considered. Comparisons are made of both the methodologies and the results across the relatively diverse international literature. The most robust finding is that conventional air pollutants (e.g., particulate matter and lead) consistently rank as high health risks across all of the CRAs examined. Given the varied nature of the settings studied in the CRAs, including level of economic development, urban-rural differences, and climate, this finding is particularly significant. Problems involving drinking water are also ranked as a high or medium health risk in almost all the countries studied. This is consistent with the results of analyses conducted by the World Bank suggesting contamination, limited coverage and erratic service by water supply systems. Beyond the major air pollutants and drinking water, the CRA results diverge significantly across countries. A number of problems involving toxic chemicals, e.g., hazardous air pollutants, rank as high health risks in the US but do not appear as consistent areas of concern in the other countries studied. This likely reflects the so-called "risk transition" - the shift from sanitation and infectious disease problems to those involving industry, vehicles and toxic substances - that often occurs with economic development. It may also reflect the greater information about sources of toxic pollutants in the U.S. For other problems, there are important differences across the developing countries and economies in transition. For example, hazardous and (industrial) non-hazardous waste issues ranked as medium or low health risks in all the countries studied, except for Taiwan, where unmanaged toxic waste sites were considered to pose high risks. While the generally low ranking is consistent with the notion that few people are directly exposed to hazardous and (industrial) non-hazardous waste, it is not entirely surprising that views might be different in Taiwan, where space is so limited and population density is so high. We suggest that the wide range of findings likely reflects genuine differences among the countries studied. However, we cannot entirely rule out the possibility that some of the observed similarities (and differences) arise from the (relatively) common methodologies employed.
10.
In this paper, humic acid (HA), known to play a large role in the binding and transport of pesticides in soil, was immobilized on a chromatographic support. Then, the association of some herbicides and rodenticides with the major soil component HA was examined using this novel chromatographic column. It appeared that HA has a lower affinity for neutral than for charged pesticides. Moreover, the influence of various parameters on pesticide retention was investigated in order to provide valuable information about both the binding mechanism and the conditions for using the HA column. For all the pesticides studied, a change in the HA-pesticide association mechanism was clearly visualized at a critical Na+ concentration in the bulk solvent, x(c), equal to 0.6 M. Around this value, the HA structure shifted between a flexible linear conformation for x < x(c) and a random coil form for x > x(c). This work confirmed the conformational change of HA immobilized on silica. In addition, for the charged pesticides only, it was clearly shown that below a Na+ concentration of 0.3 M, pesticide binding to HA decreased as the salt concentration increased, owing to ion pair formation and competition between the sodium cation and the pesticide for binding to the HA molecule. Furthermore, it was established that the HA column was stable over an extended period of time, indicating that the HA column could soon become very attractive for the risk assessment of pesticides.
11.
This paper summarises the main results of the European project BEQUAR (Benchmarking Exercise in Quantitative Area Risk Assessment in Central and Eastern European Countries). This project is among the first attempts to explore how independent evaluations of the same risk study for a given chemical establishment can differ from each other, and the consequent effects on the resulting area risk estimate. The exercise specifically aimed at exploring the manner and degree to which independent experts may disagree in their interpretation of quantitative risk assessments for the same entity. The project first compared the results of a number of independent expert evaluations of a quantitative risk assessment study for the same reference chemical establishment. This was followed by a study of the impact of the different interpretations on the estimate of the overall risk in the area concerned. To improve the inter-comparability of the results, the exercise was conducted using a single tool for area risk assessment based on the ARIPAR methodology. The results of this study are expected to contribute to an improved understanding of the inspection criteria and practices used by the different national authorities responsible for implementing the Seveso II Directive in their countries. The activity was funded under the Enlargement and Integration Action of the Joint Research Centre (JRC), which aims to provide scientific and technological support for promoting the integration of the New Member States and assisting the Candidate Countries on their way towards accession to the European Union.
12.
Application and evaluation of an engineering data model
In a companion paper, we introduced an information model called EDM for representing design and engineering information. EDM defines a small set of structures capable of depicting a wide range of semantics necessary for engineering design. These structures allow the definition of specific product models that are equivalent to database schemas; a fully instantiated structure is equivalent to an engineering or CAD database. EDM was developed in response to several criteria, among them the need to support changing technologies and evaluations and the need to support integrity checking. In this paper, EDM is applied to a small but complex example: a wall in building construction. Geometric, acoustic, and thermal properties are developed for the wall and defined in EDM structures. The example is then considered in terms of the evaluation criteria.
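As a loose, hypothetical analogue of the wall example, the sketch below expresses a small schema-like product model with geometric and performance attributes and a basic integrity check; it is not the EDM notation itself, and all names and values are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class WallGeometry:
    length_m: float
    height_m: float
    thickness_m: float

@dataclass
class WallPerformance:
    sound_reduction_db: float   # acoustic property
    u_value_w_m2k: float        # thermal transmittance

@dataclass
class Wall:
    geometry: WallGeometry
    performance: WallPerformance

    def check_integrity(self):
        """Return a list of integrity violations (an empty list means consistent)."""
        issues = []
        if min(self.geometry.length_m, self.geometry.height_m,
               self.geometry.thickness_m) <= 0:
            issues.append("geometric dimensions must be positive")
        if self.performance.u_value_w_m2k <= 0:
            issues.append("U-value must be positive")
        return issues

wall = Wall(WallGeometry(4.0, 2.7, 0.3), WallPerformance(52.0, 0.35))
print(wall.check_integrity())  # -> []
```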
13.
Eleonora Papadimitriou, George Yannis, Frits Bijleveld, João L. Cardoso. Accident Analysis and Prevention, 2013
The objective of this paper is to analyse the state of the art in risk indicators and exposure data for safety performance assessment in Europe, in terms of data availability, collection methodologies and use. More specifically, the concepts of exposure and risk are explored, as well as the theoretical properties of various exposure measures used in road safety research (e.g. vehicle- and person-kilometres of travel, vehicle fleet, road length, driver population, time spent in traffic). Moreover, the existing methods for collecting disaggregate exposure data for risk estimates at national level are presented and assessed, including survey methods (e.g. travel surveys, traffic counts) and databases (e.g. national registers). A detailed analysis of the availability and quality of existing risk exposure data is also carried out. More specifically, the results of a questionnaire survey in the European countries are presented, with detailed information on the exposure measures available, their possible disaggregations (i.e. variables and values), their conformity to standard definitions and the characteristics of their national collection methods. Finally, the potential of international risk comparisons is investigated, mainly through the International Data Files with exposure data (e.g. Eurostat, IRTAD, ECMT, UNECE, IRF). The results of this review confirm that comparing risk rates at international level may be a complex task, as the availability and quality of exposure estimates in European countries varies significantly. The lack of a common framework for the collection and exploitation of exposure data significantly limits the comparability of the national data. On the other hand, the International Data Files containing exposure data provide useful statistics and estimates in a systematic way and are currently the only sources allowing international comparisons of road safety performance under certain conditions.
14.
Progress and perspective of perfluorinated compound risk assessment and management in various countries and institutes
Yasuyuki Zushi, Jonathan Nartey Hogarh, Shigeki Masunaga. Clean Technologies and Environmental Policy, 2012, 14(1): 9-20
Perfluorooctane sulfonate (PFOS) and related compounds have recently been designated as target chemicals for regulation by the Stockholm Convention on Persistent Organic Pollutants (POPs). Many countries have investigated and tried to implement various countermeasures in response to this decision. In this article, we collect reports concerning regulations and risk evaluations of perfluorinated compounds (PFCs) and review the current PFC management practiced in various countries. The first part of this review contains a comprehensive collection of proposed standard PFC values, including provisional tolerable daily intakes (pTDI), drinking water guidelines, and predicted no-effect concentrations (PNEC). The pTDI values ranged from 0.1 to 0.3 μg/kg/day for PFOS, and there are wide margins of safety for adults. Health risks for plant workers exposed to PFCs and for infants are of particular concern. The application of these proposed values in controlling PFC pollution is one approach that may effectively control human health risk without unduly sacrificing the benefits of PFC use. The second part of this review collects and reviews a number of regulations and countermeasures, such as an EU directive, regulation in Canada, and the Significant New Use Rule (SNUR), as well as voluntary controls (e.g., the production phase-out by 3M, stewardship programs, and regulation in the semiconductor industry). Most of these regulations are based principally on the precautionary principle. However, they may not be as effective in pollution reduction as intended, because the chemicals in question are already widely distributed in the environment owing to their use and environmental mobility. In addition, these types of regulations may be inoperative in developing countries, where rapidly growing economies place great demand on high-performance materials, including PFCs. Further development of risk assessment methods that allow evaluation of the counter-risks of PFC alternatives and of the loss of benefits from a PFC ban is necessary, given the likely continued use of PFCs, especially in developing countries.
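A back-of-the-envelope margin-of-safety calculation of the kind implied above can be sketched as follows; the pTDI range (0.1-0.3 μg/kg/day for PFOS) comes from the review, while the intake figure is a purely hypothetical placeholder, not reported data.

```python
# Compare a provisional tolerable daily intake (pTDI) with an assumed exposure.
PTDI_UG_PER_KG_DAY = (0.1, 0.3)         # proposed PFOS pTDI range cited above
estimated_intake_ug_per_kg_day = 0.002  # hypothetical adult dietary intake (placeholder)

for ptdi in PTDI_UG_PER_KG_DAY:
    margin = ptdi / estimated_intake_ug_per_kg_day
    print(f"pTDI {ptdi} ug/kg/day -> margin of safety ~{margin:.0f}x")
```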
15.
In this paper we discuss Renn and Klinke's approach for risk evaluation and selection of risk management strategies. The main focus in the discussion is the foundational basis and the understanding of what risk is, and how a different foundational basis may simplify and improve the characterization of risk. We will present and discuss an alternative set of characteristics, and give some recommendations with respect to selection of risk management strategies based on different values or magnitudes of these characteristics. We believe that the main focus when describing and managing risk should be the potential consequences, represented by observable quantities, and the uncertainty related to their future values.
16.
Material selection is a fast-growing multi-criteria decision-making (MCDM) problem involving a large number of factors that influence the selection process. Proper choice of material is a critical issue for the success and competitiveness of manufacturing organizations in the global market. Selecting the most appropriate material for a particular engineering application is a time-consuming and expensive process in which several candidate materials available in the market are considered as tentative alternatives. Although a large number of mathematical approaches are now available to evaluate, select and rank alternative materials for a given engineering application, this paper explores the applicability and capability of two relatively new MCDM methods, complex proportional assessment (COPRAS) and evaluation of mixed data (EVAMIX), for materials selection. These two methods are used to rank the alternative materials while several requirements are considered simultaneously. Two illustrative examples are cited which show that these two MCDM methods can be effectively applied to real-life material selection problems. In each example, a ranked list of all possible choices, from the best to the worst suitable material, is obtained that closely matches the rankings derived by past researchers.
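To make the COPRAS procedure mentioned above concrete, here is a compact sketch applied to a hypothetical material-selection matrix; the criteria, weights and values are invented for illustration and are not taken from the cited examples (EVAMIX is not shown).

```python
import numpy as np

# Rows: candidate materials; columns: criteria (strength, density, cost).
X = np.array([
    [210.0, 7850.0, 2.1],   # material A
    [110.0, 4500.0, 6.5],   # material B
    [ 70.0, 2700.0, 2.8],   # material C
])
weights = np.array([0.5, 0.2, 0.3])
benefit = np.array([True, False, False])  # strength is beneficial; density and cost are not

# Normalise each criterion by its column sum and apply the weights.
D = weights * (X / X.sum(axis=0))

# Sums of weighted normalised values over benefit (S+) and cost (S-) criteria.
s_plus = D[:, benefit].sum(axis=1)
s_minus = D[:, ~benefit].sum(axis=1)

# Relative significance Q and utility degree (100% for the best alternative).
Q = s_plus + s_minus.sum() / (s_minus * (1.0 / s_minus).sum())
utility = 100.0 * Q / Q.max()

for name, u in sorted(zip(["A", "B", "C"], utility), key=lambda t: -t[1]):
    print(f"material {name}: utility degree {u:.1f}%")
```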
17.
Natural disasters and the challenge of extreme events: risk management from an insurance perspective
A. Smolka. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 2006, 364(1845): 2147-2165
Loss statistics for natural disasters demonstrate, even after correction for inflation, a dramatic increase in the loss burden since 1950. This increase is driven by a concentration of population and values in urban areas, the development of highly exposed coastal and valley regions, the complexity of modern societies and technologies, and probably also by the emerging consequences of global warming. This process will continue unless remedial action is taken. Managing the risk from natural disasters starts with identification of the hazards. The next step is the evaluation of the risk, where risk is a function of hazard, exposed values or human lives, and the vulnerability of the exposed objects. Probabilistic computer models have been developed for the proper assessment of risks since the late 1980s. The final steps are controlling and financing future losses. Natural disaster insurance plays a key role in this context, but private parties and governments also have to share a part of the risk. A main responsibility of governments is to formulate regulations for building construction and land use. The insurance sector and the state have to act together in order to create incentives for building and business owners to take loss prevention measures. A further challenge for the insurance sector is to transfer a portion of the risk to the capital markets, and to serve better the needs of the poor. Catastrophe bonds and microinsurance are answers to these challenges. The mechanisms described above have been developed to cope with well-known disasters like earthquakes, windstorms and floods. They can be applied, in principle, also to less well investigated and less frequent extreme disasters: submarine slides, great volcanic eruptions, meteorite impacts and the tsunamis which may arise from all these hazards. But there is an urgent need to improve the state of knowledge on these more exotic hazards in order to reduce the high uncertainty in actual risk evaluation to an acceptable level. Owing to the rarity of such extreme events, specific risk prevention measures are hardly justified, with the exception of attempts to divert Earth-orbit-crossing meteorites from their dangerous paths. For the industry it is particularly important to achieve full transparency as regards covered and non-covered risks and to define in a systematic manner the limits of insurability for super-disasters.
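The "risk as a function of hazard, exposed values and vulnerability" framing above can be illustrated with a toy expected-annual-loss calculation; the event probabilities, exposed value and damage ratios below are invented numbers, not figures from the paper.

```python
scenarios = [
    # (annual exceedance probability, mean damage ratio of the exposed value)
    (0.10, 0.01),   # frequent, minor event
    (0.01, 0.15),   # rare, damaging event
    (0.002, 0.60),  # very rare, severe event
]
exposed_value = 500e6  # hypothetical insured value in a region (currency units)

expected_annual_loss = sum(p * ratio * exposed_value for p, ratio in scenarios)
print(f"expected annual loss: {expected_annual_loss:,.0f}")
```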
18.
S. J. Wasson. Quality Assurance: Good Practice, Regulation, and Law, 1999, 7(4): 201-206
Establishing the credibility of existing data is an ongoing issue, particularly when the data sets are to be used for a secondary purpose, i.e., not the original reason for which they were collected. If the secondary purpose is similar to the primary purpose, the potential user may have little difficulty establishing credibility, since the acceptance criteria for both purposes should be similar. If the secondary purpose is different, then data credibility may be more difficult to establish, because the experiment generating the data may not have been conducted optimally for the secondary purpose and all of the necessary quality assurance data ("metadata") may not have been collected. In either case, a process is required to determine the acceptability of the data. At the time the U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program was established, similar certification and verification programs run by states or foreign countries routinely used existing data sets, for cost reasons, rather than generating new data by testing. The issue of whether existing data could be used in the ETV program therefore surfaced immediately. In response, a policy and a process addressing existing data were written and published in Appendix C of the ETV Quality and Management Plan (Hayes et al., 1998). This paper discusses how the ETV program determines the credibility of existing data used to verify the performance of environmental technologies.
19.
Dynamic reliability: towards an integrated platform for probabilistic risk assessment
Dynamic reliability methods are powerful mathematical frameworks capable of handling interactions among components and process variables explicitly. In principle, they constitute a more realistic modeling of systems for the purposes of reliability, risk and safety analysis. Although there is growing recognition in the risk community of the potentially greater correctness of these methods, no serious effort has been undertaken to utilize them in industrial applications. User-friendly tools would help foster the use of dynamic reliability methods in industry. This paper defines the key components of such a platform and, for each component, provides a detailed review of the techniques available for its implementation. The paper attempts to provide milestones in the creation of a high-level design of such tools. To achieve this purpose, a modular approach is used. For each part, various existing techniques are discussed with respect to their potential achievements. Issues related to expected future developments are also considered.
20.
S. Bouzdalkina, R. J. Bath, P. D. Greenlaw, D. Bottrell. Quality Assurance: Good Practice, Regulation, and Law, 2000, 8(3-4): 181-187
The quality evaluation and assessment of radiological data is the final step in the overall environmental data decision process. This quality evaluation and assessment process is performed outside of the laboratory, and generally the radiochemist is not involved. However, with the laboratory quality management systems in place today, the data packages of radiochemical analyses are frequently much more complex than the project/program manager can effectively handle; additionally, with little involvement from radiochemists in this process, the potential for misinterpretation of radiological data is increasing. The quality evaluation and assessment of radiochemistry data consists of making three decisions for each sample and result, remembering that the laboratory reports all the data for each analysis as well as the uncertainty in each of these analyses. At the data evaluation and assessment stage, the decisions are: (1) is the radionuclide of concern detected (each data point always has a number associated with it)?; (2) is the uncertainty associated with the result greater than would normally be expected?; and (3) if the laboratory rejected the analysis, are there serious consequences for other samples in the same group? The need for the radiochemist's expertise in this process is clear. Quality evaluation and assessment requires the input of the radiochemist, particularly in radiochemistry because of the lack of redundancy in the analytical data. This paper describes the role of the radiochemist in the quality assessment of radiochemical data for environmental decision making.
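The three per-result decisions listed above can be rendered schematically as follows; the field names and the uncertainty threshold are hypothetical, since actual acceptance criteria are project- and radionuclide-specific and would be set with the radiochemist's input.

```python
from dataclasses import dataclass

@dataclass
class Result:
    activity: float          # reported activity concentration
    uncertainty: float       # combined standard uncertainty (same units)
    detection_limit: float   # sample-specific detection limit
    rejected_by_lab: bool

def assess(result, max_rel_uncertainty=0.3):
    detected = result.activity > result.detection_limit           # decision 1
    rel_u = result.uncertainty / abs(result.activity) if result.activity else float("inf")
    return {
        "detected": detected,
        "uncertainty_acceptable": rel_u <= max_rel_uncertainty,   # decision 2
        "review_sample_group": result.rejected_by_lab,            # decision 3
    }

print(assess(Result(activity=1.8, uncertainty=0.4, detection_limit=0.5,
                    rejected_by_lab=False)))
```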