Similar Articles
20 similar articles found (search time: 31 ms)
1.
Policy-making is usually about risk management. Thus, the handling of uncertainty in science is central to its support of sound policy-making. There is value in scientists engaging in a deep conversation with policy-makers and others, not merely 'delivering' results or analyses and then playing no further role. Communicating the policy relevance of different varieties of uncertainty, including imprecision, ambiguity, intractability and indeterminism, is an important part of this conversation. Uncertainty is handled better when scientists engage with policy-makers. Climate policy aims both to alter future risks (particularly via mitigation) and to take account of and respond to relevant remaining risks (via adaptation) in the complex causal chain that begins and ends with individuals. Policy-making profits from learning how to shift the distribution of risks towards less dangerous impacts, even if the probability of events remains uncertain. Immediate value lies not only in communicating how risks may change with time and how those risks may be changed by action, but also in projecting how our understanding of those risks may improve with time (via science) and how our ability to influence them may advance (via technology and policy design). Guidance on the most urgent places to gather information and realistic estimates of when to expect more informative answers is of immediate value, as are plausible estimates of the risk of delaying action. Risk assessment requires grappling with probability and ambiguity (uncertainty in the Knightian sense) and assessing the ethical, logical, philosophical and economic underpinnings of whether a target of '50 per cent chance of remaining under +2 °C' is either 'right' or 'safe'. How do we better stimulate advances in the difficult analytical and philosophical questions, while maintaining foundational scientific work that advances our understanding of the phenomena, and at the same time provide immediate help with decisions that must be made now?

2.
New approaches are developed that use measured data to adjust the analytical mass and stiffness matrices of a system so that the agreement between the analytical modes of vibration and the modal survey is improved. By adding known masses to the structure of interest, measuring the modes of vibration of this mass‐modified system, and finally using this set of new data in conjunction with the initial modal survey, the analytical mass matrix of the structure can be corrected, after which the analytical stiffness matrix can be readily updated. By manipulating the correction matrices into vector forms, the connectivity information can be enforced, thereby preserving the physical configuration of the system and reducing the sizes of the least‐squares problems that need to be solved. Solution techniques for updating the system matrices are introduced, and the numerical issues associated with solving overdetermined and underdetermined least‐squares problems are investigated. The effects of round‐off errors are also studied, and heuristic criteria are given for determining the minimum number of modes that need to be measured in order to ensure sufficiently accurate updated mass and stiffness matrices. Numerical experiments are presented to validate the proposed model‐updating techniques, to illustrate the effects of the number of measured modes on the quality of the updated model, to show how the magnitudes and locations of the added masses influence the updated matrices, and to highlight the numerical issues discussed in this paper. Copyright © 2001 John Wiley & Sons, Ltd.
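The mass-correction step can be illustrated with a classical least-squares update of the Berman–Nagy type, which adjusts the analytical mass matrix so that a set of measured mode shapes becomes exactly mass-orthonormal. This is a sketch of the general idea only, not the paper's added-mass scheme:

```python
import numpy as np

def update_mass_matrix(M_a, Phi):
    """Least-squares-style mass-matrix correction: adjust the analytical
    mass matrix M_a so that the measured mode shapes Phi become exactly
    mass-orthonormal (Phi.T @ M @ Phi = I).  Berman-Nagy-type update,
    shown only to illustrate the correction step."""
    m_a = Phi.T @ M_a @ Phi                # analytical generalized mass
    inv = np.linalg.inv(m_a)
    I = np.eye(m_a.shape[0])
    # Symmetric minimum-norm correction enforcing mass orthonormality
    dM = M_a @ Phi @ inv @ (I - m_a) @ inv @ Phi.T @ M_a
    return M_a + dM
```

After this correction, a compatible stiffness update can be computed in the same spirit; the paper's vector-form manipulation additionally preserves connectivity, which this dense sketch does not attempt.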

3.
What is research integrity? At the United States Environmental Protection Agency (U.S. EPA) research integrity can be defined as conducting and fostering research to define, anticipate, and understand environmental problems; and generating sound, appropriate, credible, and effective solutions to those problems. Whether in government, academia, or industry, integrity is required at all stages of research, from data generation to data analysis. What constitutes research integrity? Simply put: Did we do the right thing? Did we do it the right way? Did we honestly document what we did? This is especially important if the research is used as a basis for public policy. The extensive and intensive use of the results of science in EPA's standard setting, regulatory, and enforcement responsibilities means that scientific misconduct can lead to costly and inappropriate actions through unnecessary expenditure or inadequate protection. The soundness, effectiveness, and credibility of EPA's regulations ultimately rest on the scientific and technical bases for these actions. Careful attention to research record keeping can help ensure data quality and integrity. The U.S. Environmental Protection Agency, its research requirements, and the work of the National Health and Environmental Effects Research Laboratory are discussed below.

4.
This article proposes a simple strategy for establishing sensitivity requirements (quantitation limits) for environmental chemical analyses when the primary data quality objective is to determine if a contaminant of concern is greater or less than an action level (e.g., an environmental "cleanup goal," regulatory limit, or risk-based decision limit). The approach assumes that the contaminant concentrations are normally distributed with constant variance (i.e., the variance is not significantly dependent upon concentration near the action level). When the total or "field" portion of the measurement uncertainty can be estimated, the relative uncertainty at the laboratory's quantitation limit can be used to determine requirements for analytical sensitivity. If only the laboratory component of the total uncertainty is known, the approach can be used to identify analytical methods or laboratories that will not satisfy objectives for sensitivity (e.g., when selecting methodology during project planning).
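Under the normality and constant-variance assumptions above, the requirement can be made concrete: the width of a "gray region" below the action level, together with tolerable false-positive and false-negative rates, bounds the allowable measurement standard deviation, against which a candidate method's quantitation limit can be screened. The function and parameter names below are introduced here for illustration and are not the article's exact formulation:

```python
from statistics import NormalDist

def required_sd(action_level, gray_region_lower, alpha=0.05, beta=0.05):
    """Maximum allowable total measurement SD so that a decision at the
    action level meets false-positive rate alpha and false-negative rate
    beta, assuming normally distributed results with constant variance."""
    z = NormalDist().inv_cdf
    width = action_level - gray_region_lower
    return width / (z(1 - alpha) + z(1 - beta))

def quantitation_limit_ok(ql, ql_rsd, action_level, gray_region_lower,
                          alpha=0.05, beta=0.05):
    """Screen a laboratory's quantitation limit: the SD implied at the QL
    (ql * ql_rsd, using the relative uncertainty there) must not exceed
    the allowable SD, and the QL must sit at or below the gray region."""
    s_max = required_sd(action_level, gray_region_lower, alpha, beta)
    return ql <= gray_region_lower and ql * ql_rsd <= s_max
```

As in the article, a method whose quantitation limit fails this screen can be ruled out during project planning even when only the laboratory component of the uncertainty is known.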

5.
The quantification of a risk assessment model often requires the elicitation of expert judgments about quantities that cannot be precisely measured. The aims of the model being quantified provide important guidance as to the types of questions that should be asked of the experts. The uncertainties underlying a quantity may be classified as aleatory or epistemic according to the goals of the risk process. This paper discusses the nature of such a classification and how it affects the probability elicitation process and implementation of the resulting judgments. Examples from various areas of risk assessment are used to show the practical implications of how uncertainties are treated. An extended example from hazardous waste disposal is given.
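A common practical consequence of the aleatory/epistemic classification is a two-loop ("nested") Monte Carlo: epistemic quantities are sampled in an outer loop and aleatory variability in an inner loop, so the two kinds of uncertainty remain separated in the output. The toy failure-rate example below is hypothetical, not drawn from the paper:

```python
import random

def two_loop_mc(n_epistemic=200, n_aleatory=400, seed=1):
    """Nested Monte Carlo: the outer loop samples an epistemically
    uncertain failure rate; the inner loop samples aleatory outcomes
    given that rate.  Returns the median and 95th-percentile failure
    frequency across the epistemic ensemble."""
    rng = random.Random(seed)
    freqs = []
    for _ in range(n_epistemic):
        lam = rng.uniform(0.01, 0.05)   # epistemic: the true rate is unknown
        fails = sum(rng.random() < lam for _ in range(n_aleatory))
        freqs.append(fails / n_aleatory)  # aleatory frequency, given lam
    freqs.sort()
    return freqs[len(freqs) // 2], freqs[int(0.95 * len(freqs))]
```

Collapsing both loops into one would report a single blended distribution and lose exactly the distinction the elicitation is meant to preserve.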

6.
This article examines how members of the lay public factor risk perceptions, trust and technical information from differing scientific sources into policy judgements about potentially hazardous facilities. Focusing on radwaste storage repositories, we examine how members of the public filter new information about potential hazards through risk perceptions, and adjust their own beliefs about risks in light of that information. Scientists play a large (and increasing) role in public policy debates concerning nuclear waste issues, in which public perceptions of human health and environmental risks often differ substantially from scientific consensus about those risks. Public concerns and uncertainties are compounded when scientists from competing groups (government agencies, scientific institutions, industries, and interest groups) make different claims about the likely health and environmental consequences of different policy options. We show the processes by which the public receives and processes scientific information about nuclear waste management risks using data taken from interviews with 1800 randomly selected individuals (1200 in New Mexico, and 600 nationwide). Among the more important findings are: (1) members of the public are able to make quite reasonable estimates about what kinds of positions on the risks of nuclear waste disposal will be taken by scientists from differing organizations (e.g. scientists from environmental groups, government agencies, or the nuclear industry); (2) in assessing the credibility of scientific claims, members of the public place great emphasis on the independence of the scientists from those who fund the research; and (3) prior expectations about the positions (or expected biases) of scientists from different organizations substantially affect the ways in which members of the public weigh (and utilize) information that comes from these scientists.

7.
Decision-making under uncertainty describes most environmental remediation and waste management problems. Inherent limitations in knowledge concerning contaminants, environmental fate and transport, remedies, and risks force decision-makers to select a course of action based on uncertain and incomplete information. Because uncertainties can be reduced by collecting additional data, uncertainty and sensitivity analysis techniques have received considerable attention. When costs associated with reducing uncertainty are considered in a decision problem, the objective changes; rather than determine what data to collect to reduce overall uncertainty, the goal is to determine what data to collect to best differentiate between possible courses of action or decision alternatives. Environmental restoration and waste management requires cost-effective methods for characterization and monitoring, and these methods must also satisfy regulatory requirements. Characterization and monitoring activities imply that, sooner or later, a decision must be made about collecting new field data. Limited fiscal resources for data collection should be committed only to those data that have the most impact on the decision at lowest possible cost. Applying influence diagrams in combination with data worth analysis produces a method which not only satisfies these requirements but also gives rise to an intuitive representation of complex structures not possible in the more traditional decision tree representation. This paper demonstrates the use of influence diagrams in data worth analysis by applying it to a monitor-and-treat problem often encountered in environmental decision problems.
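The "worth" of new data can be sketched as an expected value of sample information calculation: compare the expected cost of the best decision made now with the expected cost after an imperfect observation, then subtract the sampling cost. The two-action treat/monitor toy below uses hypothetical numbers; the paper carries out the same logic over influence diagrams with much richer structure:

```python
def data_worth(p_contam, cost_treat, loss_miss, sens, spec, cost_sample):
    """Net expected value of sample information for a toy treat/monitor
    choice: a site is contaminated with prior probability p_contam, an
    imperfect test has sensitivity `sens` and specificity `spec`, and
    missing contamination while monitoring costs `loss_miss`."""
    # Best decision with no new data: treat now, or monitor and risk loss
    prior_best = min(cost_treat, p_contam * loss_miss)
    # Probability of a positive test result
    p_pos = sens * p_contam + (1 - spec) * (1 - p_contam)
    # Expected cost when deciding optimally after each possible result
    exp_cost = 0.0
    for p_out, p_post in (
        (p_pos, sens * p_contam / p_pos),
        (1 - p_pos, (1 - sens) * p_contam / (1 - p_pos)),
    ):
        exp_cost += p_out * min(cost_treat, p_post * loss_miss)
    # Value of the information, net of the cost of collecting it
    return prior_best - exp_cost - cost_sample
```

A negative result says the proposed data collection cannot change the decision enough to pay for itself, which is precisely the screening question data worth analysis asks.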

8.
Uncertainty is pervasive in economic policy-making. Modern economies share similarities with other complex systems in their unpredictability. But economic systems also differ from those in the natural sciences because outcomes are affected by the state of beliefs of the systems' participants. The dynamics of beliefs and how they interact with economic outcomes can be rich and unpredictable. This paper relates these ideas to the recent crisis, which has reminded us that we need a financial system that is resilient in the face of the unpredictable and extreme. It also highlights how such uncertainty puts a premium on sound communication strategies by policy-makers. This creates challenges in informing others about the uncertainties in the economy, and how policy is set in the face of those uncertainties. We show how the Bank of England tries to deal with some of these challenges in its communications about monetary policy.

9.
The technological risks associated with electricity generating options are a crucial consideration in the governance of energy strategies. Conversely, many central issues in the broader social debate over the governance of environmental risk (such as acid gas emissions, radioactive waste management, nuclear safety and global climate change) relate very strongly to technology choice in the electricity supply sector. The particularities of this field, therefore, offer a topical and pertinent case with which to explore the relationship between science and precaution in the governance of technological risk. By reference to the electricity sector, the present paper examines the contrasts between 'risk-based' and 'precautionary' approaches to the governance of risk, paying particular attention to the problems of intractable uncertainties and divergent values. A number of theoretical and methodological issues in conventional risk-assessment and cost-benefit analysis are examined and their practical implications for appraisal explored. Attention then turns to the form that might be taken by approaches to the governance of energy risks that are at the same time scientifically well-founded and precautionary. Conclusions are drawn for decision and policy making in this area.

10.
We present preliminary results of our joint investigations to monitor and mitigate environmental pollution, a leading contributor to chronic and deadly health disorders and diseases affecting millions of people each year. Using nanotechnology-based gas sensors, pollution is monitored at several ground stations. The sensor unit is portable, provides instantaneous ground pollution concentrations accurately, and can be readily deployed to disseminate real-time pollution data to a web server providing a topological overview of monitored locations. We are also employing remote sensing technologies with high spatial and spectral resolution to model urban pollution using satellite images and image processing. One of the objectives of this investigation is to develop a unique capability to acquire, display and assimilate these valuable sources of data to accurately assess urban pollution by real-time monitoring using commercial sensors fabricated using nanofabrication technologies and satellite imagery. This integrated tool will be beneficial for prediction processes that support public awareness and establish policy priorities for air quality in polluted areas. The complex nature of environmental pollution data mining requires computing technologies that integrate multiple sources and repositories of data over multiple networking systems and platforms that must be accurate, secure, and reliable. An evaluation of information security risks and strategies within an environmental information system is presented. In addition to air pollution, we explore the efficacy of nanostructured materials in the detection and remediation of water pollution. We present our results of sorption on advanced nanomaterials-based sorbents that have been found effective in the removal of cadmium and arsenic from water streams.

11.
This paper proposes a hybrid fuzzy-stochastic robust programming (FSRP) method and applies it to a case study of regional air quality management. As an extension of the existing fuzzy-robust programming and chance-constrained programming methods, FSRP can explicitly address complexities and uncertainties without unrealistic simplifications. Parameters in the FSRP model can be expressed as PDFs and/or membership functions, such that robustness of the optimization process can be enhanced. In its solution process, the FSRP model is converted to a deterministic version by transforming m imprecise constraints into 2km precise inclusive constraints that correspond to k α-cut levels (under each given significance level). Results of the case study indicate that FSRP is applicable to problems that involve a variety of uncertainties. Air pollution control invariably involves a number of processes with socio-economic and environmental implications. These processes are associated with extensive uncertainties due to their complex, interactive, dynamic, and multiobjective features. Through the FSRP modeling study, useful solutions for planning regional air quality management practices have been generated. They reflect complex trade-offs between environmental and economic considerations. Willingness to pay higher operating costs will guarantee meeting environmental objectives; however, a desire to reduce the costs will run the risk of potentially violating the emission and/or ambient-air-quality standards.
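The conversion step can be sketched for a single-variable constraint: each α-cut of a triangular fuzzy number yields an interval, and each imprecise constraint is replaced by a pessimistic and an optimistic crisp version per α level, so m constraints become 2km crisp ones. This toy ignores the stochastic (chance-constrained) side of FSRP entirely:

```python
def alpha_cut(tri, alpha):
    """Interval (alpha-cut) of a triangular fuzzy number (a, m, b)."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def fuzzify_constraint(coeff_tri, rhs_tri, alphas):
    """Replace one imprecise constraint  coeff * x <= rhs  by a pair of
    crisp inclusive constraints per alpha-cut level, in the spirit of
    fuzzy-robust programming (single-variable sketch)."""
    crisp = []
    for al in alphas:
        c_lo, c_hi = alpha_cut(coeff_tri, al)
        r_lo, r_hi = alpha_cut(rhs_tri, al)
        crisp.append((c_hi, r_lo))   # c_hi * x <= r_lo  (tightest version)
        crisp.append((c_lo, r_hi))   # c_lo * x <= r_hi  (loosest version)
    return crisp
```

Solving the deterministic program against all 2k versions of each constraint is what gives the method its robustness across membership levels.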

12.
A methodology using probabilistic risk assessment techniques is proposed for evaluating the design of multiple confinement barriers for a fusion plant within the context of a limited allowable risk. The methodology was applied to the reference design of the International Thermonuclear Experimental Reactor (ITER). Accident sequence models were developed to determine the probability of radioactive releases from each confinement barrier. The current ITER design requirements, which set environmental radioactive release limits for individual event sequences grouped in categories by frequency, are extended to derive a limit on the overall plant risk. This avoids detailed accounting for event uncertainties in both frequency and consequence. Thus, an analytical form for a limit line is derived as a complementary cumulative frequency of permissible radioactive releases to the environment. The line can be derived using risk aversion of the designer's own choice. By comparing the releases from each confinement barrier against this limit line, a decision can be made about the number of barriers required to comply with the design requirements. A decision model using multi-attribute utility function theory was constructed to help the designer in choosing the type of the tokamak building while considering preferences for attributes such as construction cost, project completion time, technical feasibility and public attitude. Sensitivity analysis on some of the relevant parameters in the model was performed.
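The limit-line comparison can be sketched directly: build the empirical complementary cumulative frequency of releases from a set of accident sequences and check it against a Farmer-style frequency-consequence line whose steepness encodes risk aversion. The functional form and parameter values below are illustrative stand-ins, not the paper's derived line:

```python
def limit_line(release, r_total, r_min=1e-7, aversion=1.0):
    """Allowable complementary cumulative frequency (per year) for a
    given release magnitude: a Farmer-style line F = r_total / release**a,
    floored at r_min.  `aversion` > 1 penalizes large releases harder."""
    return max(r_min, r_total / release ** aversion)

def complies(sequences, r_total, aversion=1.0):
    """sequences: list of (frequency_per_year, release) pairs for one
    confinement barrier.  Builds the empirical CCDF and checks that it
    stays below the limit line at every release magnitude."""
    seqs = sorted(sequences, key=lambda s: s[1])
    ccf = sum(f for f, _ in seqs)   # frequency of any release >= smallest
    ok = True
    for f, rel in seqs:
        if ccf > limit_line(rel, r_total, aversion=aversion):
            ok = False
        ccf -= f                    # drop sequences below the next magnitude
    return ok
```

Running this check barrier by barrier mirrors the paper's use of the limit line to decide how many confinement barriers the design needs.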

13.
The paper poses the question of whether the findings from social science research on risk perception could (or indeed should) find direct application in the domains of risk regulation and management. The problem this poses, of balancing and integrating the best available scientific judgements and evidence on the one hand with aspects of public risk evaluations on the other, is one of the most difficult questions to be faced by democratic governments and their regulators today. The paper argues that the findings from risk perception research do hold implications for the ways in which risk analysis and regulation should be done. Current social science research on risk perceptions is discussed together with existing UK regulatory policy, which allows, to a certain extent, contextual issues to be factored into risk tolerability decision making. The paper concludes by presenting a set of arguments both for and against the use of risk perceptions in policy. Brief conclusions are drawn regarding the conditions under which public preferences and values might be optimally elicited.

14.
Queue time constraints are commonly imposed to ensure product quality in contemporary production systems. We study the performance of two single stations with deterministic service times and a predetermined time window in between, where both stations suffer time-based pre-emptive breakdowns. To improve productivity, achieving higher capacity and lower rework rate are the two main objectives. While higher capacity requires a higher work-in-process (WIP)-level threshold, a lower rework rate requires a smaller one. To quantify the trade-off between the two objectives, an analytical model is derived. The model is then used to derive the WIP-level threshold control policy for a time-constrained system. We also show that system capacity diminishes with the decrease in WIP-level thresholds.
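The capacity/rework trade-off can be reproduced in a minimal discrete-time simulation: station 1 feeds a buffer that is capped at the WIP threshold, station 2 drains it, both stations suffer random breakdowns, and any job whose queue time exceeds the window counts as rework. All parameters are hypothetical; the paper derives this trade-off analytically rather than by simulation:

```python
import random
from collections import deque

def simulate(wip_cap, window=8, n_slots=200000, p_fail=0.02, p_repair=0.1,
             seed=7):
    """Toy two-station line with a queue-time window in between.  Each
    slot, station 1 (when up) feeds one job into the buffer unless the
    WIP threshold `wip_cap` is reached; station 2 (when up) takes the
    oldest job.  Jobs that waited more than `window` slots count as
    rework.  Returns (throughput per slot, rework fraction)."""
    rng = random.Random(seed)
    buf = deque()
    up1 = up2 = True
    done = rework = 0
    for t in range(n_slots):
        # Two-state breakdown/repair chain for each station
        up1 = (rng.random() >= p_fail) if up1 else (rng.random() < p_repair)
        up2 = (rng.random() >= p_fail) if up2 else (rng.random() < p_repair)
        if up1 and len(buf) < wip_cap:
            buf.append(t)                  # admission control at the threshold
        if up2 and buf:
            start = buf.popleft()
            done += 1
            if t - start > window:
                rework += 1
    return done / n_slots, rework / done
```

A higher threshold buffers station 2 against upstream breakdowns (more throughput) but lets queue times grow past the window (more rework), which is exactly the tension the paper's control policy balances.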

15.
Technology in Society, 1999, 21(2): 121–133
Prediction in traditional, reductionist natural science serves the role of validating hypotheses about invariant natural phenomena. In recent years, a new type of prediction has arisen in science, motivated in part by the needs of policy makers and the availability of new technologies. This new predictive science seeks to foretell the behavior of complex environmental phenomena such as climate change, earthquakes, and extreme weather events. Significant intellectual and financial resources are now devoted to such efforts, in the expectation that predictions will guide policy making. These expectations, however, derive in part from confusion about the different roles of prediction in science and society. Policy makers lack a framework for assessing when and if prediction can help achieve policy goals. This article is a first step towards developing such a framework.

16.
  • This study argues that the government-relationship building efforts by foreign invested enterprises (FIEs) depend on the perceived level of regulatory uncertainties, which, in turn, is conditioned by the institutional distances between their home and host countries.
  • The regulatory antecedents (regulatory complexity and enforcement uncertainty) to government-relationship building by foreign-invested enterprises and the moderating effects of institutional distance (regulative and cultural distances) in the context of the large transition economy of China are examined using a sample of 424 foreign-invested enterprises.
  • The results show that FIEs tend to actively engage in government-relationship building when regulatory uncertainties (complexity and enforcement uncertainties) are high. The moderating analyses reveal the strengthening effects of regulative distances on the relationship between regulatory uncertainties and government-relationship building and the mixed effects of cultural distance.

17.
Random uncertainties in finite element models in linear structural dynamics are usually modeled by using parametric models. This means that: (1) the uncertain local parameters occurring in the global mass, damping and stiffness matrices of the finite element model have to be identified; (2) appropriate probabilistic models of these uncertain parameters have to be constructed; and (3) functions mapping the domains of uncertain parameters into the global mass, damping and stiffness matrices have to be constructed. In the low-frequency range, a reduced matrix model can then be constructed using the generalized coordinates associated with the structural modes corresponding to the lowest eigenfrequencies. In this paper we propose an approach for constructing a random uncertainties model of the generalized mass, damping and stiffness matrices. This nonparametric model does not require identifying the uncertain local parameters and consequently, obviates construction of functions that map the domains of uncertain local parameters into the generalized mass, damping and stiffness matrices. This nonparametric model of random uncertainties is based on direct construction of a probabilistic model of the generalized mass, damping and stiffness matrices, which uses only the available information constituted of the mean value of the generalized mass, damping and stiffness matrices. This paper describes the explicit construction of the theory of such a nonparametric model.
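The nonparametric idea, randomizing the generalized matrices directly while preserving their mean and positive-definiteness, can be sketched with a normalized Wishart construction. This is a stand-in for the paper's maximum-entropy ensemble, not its exact probabilistic model; `dof` here is an assumed dispersion-control parameter:

```python
import numpy as np

def random_spd_ensemble(M_mean, n_samples=500, dof=60, seed=0):
    """Monte Carlo sketch of a nonparametric uncertainty model: random
    symmetric positive-definite matrices whose ensemble mean equals the
    given generalized (mass, damping, or stiffness) matrix.  Uses a
    normalized Wishart construction; larger `dof` means less scatter."""
    rng = np.random.default_rng(seed)
    n = M_mean.shape[0]
    L = np.linalg.cholesky(M_mean)          # factor the mean matrix
    out = []
    for _ in range(n_samples):
        G = rng.normal(size=(dof, n)) / np.sqrt(dof)   # E[G.T @ G] = I
        W = G.T @ G                          # random SPD with identity mean
        out.append(L @ W @ L.T)              # random SPD with mean M_mean
    return out
```

No local parameter is ever identified: the only input is the mean generalized matrix, which is the defining feature of the nonparametric approach the abstract describes.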

18.
A Helmholtz free energy equation of state for the fluid phase of isobutane (R-600a) has been developed on the basis of the ITS-90 temperature scale. This model was developed using selected measurements of the pressure–density–temperature (P, ρ, T), isobaric heat capacity, speed of sound, and saturation properties. The structure of the present model consists of only 19 terms in its functional form, which is the same as those successfully applied to our recent modeling of R-290 and R-600, and a nonlinear fitting procedure was used to determine the numerical parameters of the present equation of state. Based on a comparison with available experimental data, it is recognized that the model represents most of the reliable experimental data accurately in the range of validity covering temperatures from 113.56 K (the triple-point temperature) to 573 K, at pressures up to 35 MPa, and at densities up to 749 kg·m⁻³. Physically sound behavior of the derived thermodynamic properties over the entire fluid phase is demonstrated. The estimated uncertainties of properties calculated using the model are 0.2% in density, 1% in heat capacities, 0.02% in the speed of sound for the vapor, 1% in the speed of sound elsewhere, and 0.2% in vapor pressure, except in the critical region. In addition, graphical and statistical comparisons between experimental data and the available thermodynamic models, including the present one, showed that the model can provide a physically sound representation of all the thermodynamic properties of engineering importance.
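The Helmholtz formulation is convenient because every thermodynamic property follows from derivatives of a(ρ, T); for example, pressure comes from the identity p = ρ²(∂a/∂ρ)_T. The sketch below checks that identity numerically on the ideal-gas part of a Helmholtz model, using an approximate specific gas constant for isobutane, and is not the paper's fitted R-600a correlation:

```python
import math

def pressure(a_func, rho, T, h=1e-6):
    """Pressure from a specific Helmholtz free energy a(rho, T) via the
    thermodynamic identity p = rho**2 * (da/drho)_T, evaluated with a
    central finite difference in density."""
    da = (a_func(rho * (1 + h), T) - a_func(rho * (1 - h), T)) / (2 * rho * h)
    return rho ** 2 * da

# Sanity check on the ideal-gas contribution a = R*T*ln(rho): the identity
# must recover p = rho*R*T.
R = 8.314 / 0.05812  # J/(kg*K), approx. specific gas constant of isobutane
a_ideal = lambda rho, T: R * T * math.log(rho)
p = pressure(a_ideal, 2.0, 300.0)
```

In a full multiparameter equation of state, the same differentiation applied to the residual part yields heat capacities, speed of sound, and the other properties the abstract lists.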

19.
The mantra that policy and management should be 'evidence-based' is well established. Less so are the implications that follow from 'evidence' being predictions of the future (forecasts, scenarios, horizons) even though such futures define the actions taken today to make the future sustainable. Here, we consider the tension between 'evidence', reliable because it is observed, and predictions of the future, unobservable in conventional terms. For flood risk management in England and Wales, we show that futures are actively constituted, and so imagined, through 'suites of practices' entwining policy, management and scientific analysis. Management has to constrain analysis because of the many ways in which flood futures can be constructed, but also because of commitment to an accounting calculus, which requires risk to be expressed in monetary terms. It is grounded in numerical simulation, undertaken by scientific consultants who follow policy/management guidelines that define the futures to be considered. Historical evidence is needed to deal with process and parameter uncertainties and the futures imagined are tied to pasts experienced. Reliance on past events is a challenge for prediction, given changing probability (e.g. climate change) and consequence (e.g. development on floodplains). So, risk management allows some elements of risk analysis to become unstable (notably in relation to climate change) but forces others to remain stable (e.g. invoking regulation to prevent inappropriate floodplain development). We conclude that the assumed separation of risk assessment and management is false because the risk calculation has to be defined by management. Making this process accountable requires openness about the procedures that make flood risk analysis more (or less) reliable to those we entrust to produce and act upon them such that, unlike the 'pseudosciences', they can be put to the test of public interrogation by those who have to live with their consequences.  

20.
Researchers have stressed that manufacturing system flexibility research requires a quantitative model that allows a manufacturing system to prioritize its flexibility dimensions and improve its performance. The present research demonstrates a quantification model for assessing the degree of environmental uncertainty and illustrates a method for identifying the flexibility improvements a manufacturing system requires, so that a company can prioritize the types of manufacturing flexibility it needs in an uncertain environment. Quantitative approaches including quality function deployment (QFD), analytical hierarchy process (AHP), and grey relational analysis (GRA) have been employed to find a means for improving the flexibility of a manufacturing system to cope with environmental uncertainty. QFD is the focal approach for the deployment of the integrated structure of the research. AHP is applied to explore the relative weighted importance of environmental uncertainty factors, while GRA is used to find out the relationships between manufacturing flexibility and environmental uncertainty. A combination of these approaches reveals a useful tool for managers to prioritize the types of flexibility which a manufacturing system requires for coping with an uncertain environment. In particular, the present research studied the manufacturing flexibility requirements of a food company in Taiwan.
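Of the three techniques, the GRA step is the easiest to make concrete: each flexibility alternative's (pre-normalized) series is scored against a reference series, and the resulting grades give the priority ranking. This is a generic GRA sketch with the standard distinguishing coefficient, not the paper's QFD-integrated formulation:

```python
def grey_relational_grades(ref, alternatives, zeta=0.5):
    """Grey relational analysis: grade each alternative's series against
    a reference series.  All values are assumed pre-normalized to [0, 1]
    and at least one alternative must differ from the reference (so the
    maximum deviation is nonzero); zeta is the distinguishing coefficient."""
    deltas = [[abs(r - x) for r, x in zip(ref, alt)] for alt in alternatives]
    d_min = min(min(row) for row in deltas)
    d_max = max(max(row) for row in deltas)
    grades = []
    for row in deltas:
        # Grey relational coefficient per point, averaged into a grade
        coeffs = [(d_min + zeta * d_max) / (d + zeta * d_max) for d in row]
        grades.append(sum(coeffs) / len(coeffs))
    return grades
```

In the integrated framework, AHP-derived weights on the uncertainty factors would replace the plain average here, and QFD carries the resulting priorities into the flexibility requirements.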
