Similar Documents
20 similar documents found (search time: 31 ms)
1.
The standard equivalent linearization procedure for estimating the mean and variance of the response of nonlinear dynamic systems has proved to be an unusually effective technique. For over forty years there has been general agreement about the procedure to be followed. Recently two independent claims have been made that the standard procedure harbors a subtle flaw. In place of the standard procedure, essentially the same alternative procedure was claimed to be the “correct” procedure, even though, in the test cases investigated, the alternative “correct” procedure produced estimates with greater errors than the “incorrect” standard procedure. The present note investigates the claim that the standard procedure is flawed and finds that: (a) there is no subtle flaw in the standard procedure; (b) the proposed alternative procedure differs from the standard procedure in that it employs a different criterion for selecting the optimum linear approximation; (c) there is also no flaw in the proposed alternative procedure; but, (d) there does not seem to be any practical advantage to using the proposed alternative, since the standard procedure is simpler and more accurate.
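For a concrete illustration of the standard procedure, here is a minimal Python sketch of statistical linearization for a Duffing oscillator under white-noise excitation. The oscillator, its parameter values, and the fixed-point iteration are illustrative assumptions for this sketch, not material from the note itself.

```python
import math

def duffing_variance(zeta=0.05, w0=1.0, eps=0.5, S0=0.1, tol=1e-12):
    """Standard statistical linearization of a Duffing oscillator
    x'' + 2*zeta*w0*x' + w0**2*x + eps*x**3 = w(t), w(t) white noise
    with two-sided PSD S0.

    The cubic term is replaced by k_eq*x with k_eq = 3*eps*sigma**2
    (the mean-square-optimal gain for a zero-mean Gaussian response),
    and the linear-system variance formula
        sigma**2 = pi*S0 / (c * (w0**2 + k_eq)),  c = 2*zeta*w0,
    is iterated to a fixed point.
    """
    c = 2.0 * zeta * w0
    sigma2 = math.pi * S0 / (c * w0**2)    # start from the eps = 0 solution
    for _ in range(200):
        k_eq = 3.0 * eps * sigma2           # Gaussian closure: E[x^4]/E[x^2] = 3*sigma^2
        new = math.pi * S0 / (c * (w0**2 + k_eq))
        if abs(new - sigma2) < tol:
            break
        sigma2 = new
    return sigma2
```

The iteration converges because the update map is a contraction near the fixed point; the returned variance is always below the linear (eps = 0) value, reflecting the stiffening effect of the cubic term.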

2.
Storybuilder—A tool for the analysis of accident reports   (Total citations: 1; self-citations: 1; citations by others: 0)
As part of an ongoing effort by the Ministry of Social Affairs and Employment of The Netherlands, a research project is being undertaken to construct a causal model for the most commonly occurring scenarios related to occupational risk. This model should provide quantitative insight into the causes and consequences of occupational accidents. The results should be used to help select optimal strategies for reducing these risks, taking the costs of accidents and of measures into account. The research is undertaken by an international consortium under the name Workgroup Occupational Risk Model. One of the components of the model is a tool to systematically classify and analyse past accidents. This tool, “Storybuilder”, and its place in the Occupational Risk Model (ORM) are described in the paper. The paper gives some illustrations of the application of Storybuilder, drawn from the study of ladder accidents, which form one of the biggest single accident categories in the Dutch data.

3.
A post-processing technique for determining relative system sensitivity to groups of parameters and system components is presented. It is assumed that an appropriate parametric model is used to simulate system behavior using Monte Carlo techniques and that a set of realizations of system output(s) is available. The objective of our technique is to analyze the input vectors and the corresponding output vectors (that is, to post-process the results) to estimate the relative sensitivity of the output to the input parameters (taken singly and as a group) and thereby rank them. This technique differs from design-of-experiments techniques in that a partitioning of the parameter space is not required before the simulation. A tree structure (which looks similar to an event tree) is developed to better explain the technique. Each limb of the tree represents a particular combination of parameters or a combination of system components. For convenience, and to distinguish it from the event tree, we call it the parameter tree.

To construct the parameter tree, the samples of input parameter values are treated as either a “+” or a “−” based on whether the sampled parameter value is greater than or less than a specified branching criterion (e.g., the mean, median, or a percentile of the population). The corresponding system outputs are also segregated into similar bins. Partitioning the first parameter into a “+” or a “−” bin creates the first level of the tree, containing two branches. At the next level, realizations associated with each first-level branch are further partitioned into two bins using the branching criterion on the second parameter, and so on until the tree is fully populated. Relative sensitivities are then inferred from the number of samples associated with each branch of the tree.

The parameter tree approach is illustrated by applying it to a number of preliminary simulations of the proposed high-level radioactive waste repository at Yucca Mountain, NV. Using a Total System Performance Assessment code called TPA, realizations are obtained and analyzed. In the examples presented, groups of five important parameters, one for each level of the tree, are used to identify branches of the tree and construct the bins. In the first example, the five important parameters are selected by more traditional sensitivity analysis techniques. This example shows that relatively few branches of the tree dominate system performance. In another example, the same realizations are used, but the most important five-parameter set is determined in a stepwise manner (using the parameter tree technique), and it is found that these five parameters do not match the five of the first example. This important result shows that sensitivities based on individual parameters (i.e., one parameter at a time) may differ from sensitivities estimated from joint sets of parameters (i.e., two or more parameters at a time).

The technique is extended using subsystem outputs to define the branches of the tree. The subsystem outputs used in this example are the total cumulative radionuclide releases (TCR) from the engineered barriers, the unsaturated zone, and the saturated zone over 10,000 yr. The technique is found to be successful in estimating the relative influence of each of these three subsystems on the overall system behavior.
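The branch-construction step described above can be sketched in a few lines of Python. This is a schematic illustration under assumed names (it is not the TPA post-processor), using the sample median of each parameter as the branching criterion.

```python
import random
import statistics

def parameter_tree(samples, outputs, params):
    """Group Monte Carlo realizations into the 2**k bins ('branches') of a
    parameter tree: each sampled value is coded '+' or '-' according to
    whether it exceeds the branching criterion (here, the median of that
    parameter's sample).  The number of realizations falling in each
    branch, and the outputs collected there, indicate joint sensitivity.
    """
    medians = {p: statistics.median(s[p] for s in samples) for p in params}
    bins = {}
    for s, y in zip(samples, outputs):
        branch = tuple('+' if s[p] > medians[p] else '-' for p in params)
        bins.setdefault(branch, []).append(y)
    return bins

# Toy model: output dominated by parameter 'a', weakly affected by 'b'.
random.seed(0)
samples = [{'a': random.random(), 'b': random.random()} for _ in range(400)]
outputs = [s['a'] + 0.1 * s['b'] for s in samples]
bins = parameter_tree(samples, outputs, ['a', 'b'])
```

Comparing the mean output across branches (e.g., ('+', '+') versus ('−', '−')) then reveals which parameter combinations dominate the response, mirroring how the paper infers sensitivity from branch occupancy.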

4.
The World Health Organisation (WHO) estimates that road traffic accidents are the third leading cause of “death and disease” worldwide. A number of countries have, therefore, launched safety campaigns that have reduced their fatalities. In almost every case, however, this reduction has not been matched by a fall in the total frequency of road traffic accidents. Low-severity incidents remain a significant problem. “Attribution error” provides one plausible explanation for this phenomenon: most drivers believe that they are less likely to be involved in an accident than other motorists. Existing road safety campaigns do little to address this problem; they focus on national and regional statistics that often seem remote from the local experiences of road users. This paper, therefore, describes the design and development of a system to provide the general public with access to information on the location and circumstances of road accidents in a Scottish city. The closing sections describe the initial results from a psychometric study that is intended to determine whether the information provided by such an application has any impact on individual risk perception.

5.
Recent works [Epstein S, Rauzy A. Can we trust PRA? Reliab Eng Syst Safety 2005; 88:195–205] have questioned the validity of the traditional fault tree/event tree (FTET) representation of probabilistic risk assessment problems. Regardless of whether the risk model is solved through FTET or binary decision diagrams (BDDs), importance measures need to be calculated to provide risk managers with information on the risk/safety significance of system structures and components (SSCs). In this work, we discuss the computation of the Fussell–Vesely (FV), criticality, Birnbaum, risk achievement worth (RAW) and differential importance measure (DIM) for individual basic events, basic event groups and components. For individual basic events, we show that these importance measures are linked by simple relations and that this makes it possible to compute basic event DIMs in both FTET and BDD codes without additional model runs. We then investigate whether and how importance measures can be extended to basic event groups and components. Findings show that the estimation of a group Birnbaum or criticality importance is not possible. On the other hand, we show that the DIM of a group or of a component is exactly equal to the sum of the DIMs of the corresponding basic events and can therefore be found with no additional model runs. The above findings hold for both the FTET and the BDD methods.
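The single-event relations referred to above can be illustrated on a toy fault tree. The three-event system and its probabilities below are invented for illustration, and DIM is computed under the uniform-change criterion, under which it reduces to normalized Birnbaum importance; this is a sketch, not the paper's code.

```python
def risk(p):
    """Top event probability for the example fault tree TOP = A or (B and C).
    (Hypothetical three-event system; exact expression, not a rare-event
    approximation.)"""
    return 1 - (1 - p['A']) * (1 - p['B'] * p['C'])

def importance(p, e):
    """Classical importance measures for basic event e from three risk
    evaluations: nominal R, R with e certain (R1), R with e impossible (R0)."""
    R = risk(p)
    R1 = risk(dict(p, **{e: 1.0}))
    R0 = risk(dict(p, **{e: 0.0}))
    B = R1 - R0                       # Birnbaum
    return {'B': B,
            'FV': 1 - R0 / R,         # Fussell-Vesely
            'RAW': R1 / R,            # risk achievement worth
            'CR': B * p[e] / R}       # criticality

p = {'A': 0.01, 'B': 0.05, 'C': 0.1}
Bs = {e: importance(p, e)['B'] for e in p}
# DIM under uniform parameter changes: normalized Birnbaum; by construction
# the DIM of a group is the sum of its members' DIMs, with no extra runs.
DIM = {e: Bs[e] / sum(Bs.values()) for e in p}
```

Note how every measure comes from the same three model evaluations per event, which is the sense in which the simple relations avoid additional model runs.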

6.
Black Tournai “marble”, a fine-grained Lower Carboniferous (Tournaisian) limestone able to take a good polish, has been widely used in the Flanders region (Belgium). Highly crafted baptismal fonts and tombslabs were also exported to England, France and elsewhere during the Middle Ages. Such objects are particularly valuable since their distribution aids the dating of historical events and the reconstruction of medieval trade. Similar black “marble” was extracted in the Meuse valley (Belgium) in the Middle Ages, and there are exploited sources in the UK, Ireland and elsewhere. Thus, it is not straightforward to determine the provenance of black “marble”. Based on geological, stylistic and historical evidence, this paper shows the likelihood that a black “marble” tombslab found in Nidaros Cathedral in Trondheim (central Norway) was extracted and crafted in Tournai and shipped northwards around 1160, possibly for the grave of the first Norwegian archbishop, Jon Birgerson. The tombslab represents the first known crafted stone imported to Norway from the European continent/British Isles and is thus unique in a historical context. The properties of the Trondheim tombslab match those of black Tournai “marble”: it is a silicified, bioclastic packstone loaded with crinoids, featuring bryozoa and fragments of brachiopods and ostracods. The high silica content and the absence of foraminifers distinguish the stone from the Viséan black “marble” quarried in the Meuse valley.

7.
The Safety Occurrence Analysis Methodology (SOAM) developed for EUROCONTROL is an accident investigation methodology based on the Reason model of organisational accidents. The purpose of a SOAM is to broaden the focus of an investigation from human involvement issues, also known as “active failures of operational personnel” under Reason's original model, to include analysis of the latent conditions deeper within the organisation that set the context for the event. Such an approach is consistent with the tenets of Just Culture, in which people are encouraged to provide full and open information about how incidents occurred and are not penalised for errors.

A truly systemic approach is not simply a means of transferring responsibility for a safety occurrence from front-line employees to senior managers. A consistent philosophy must be applied, where the investigation process seeks to correct deficiencies wherever they may be found, without attempting to apportion blame or liability.

8.
The starting point for this paper is a traditional approach to maintenance optimization in which an objective function is used for optimizing maintenance intervals. The objective function reflects maintenance cost, the cost of loss of production/services, and safety costs, and is based on a classical cost–benefit analysis approach in which a value of prevented fatality (VPF) is used to weight the importance of safety. However, the rationale for such an approach can be questioned. What is the meaning of such a VPF figure, and is it sufficient to reflect the importance of safety by calculating the expected fatality loss from the VPF and the potential loss of lives (PLL), as is done in cost–benefit analyses? Should the VPF be the same number for all types of accidents, or should it be increased in the case of multiple-fatality accidents to reflect gross accident aversion?

In this paper, these issues are discussed. We conclude that we have to see beyond the expected values in situations with high safety impacts. A framework is presented which opens up a broader decision basis, covering considerations of the potential for gross accidents, the types of uncertainties, and lack of knowledge of important risk influencing factors. Decisions with a high safety impact are moved from the maintenance department to the “Safety Board” for a broader discussion. In this way, we avoid the objective function being used mechanically to optimize maintenance, with important safety-related decisions made implicitly and outside the normal arena for safety decisions, e.g. outside the traditional “Safety Board”.

A case study from the Norwegian railways is used to illustrate the discussion.
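As a sketch of the kind of objective function the paper starts from, the following Python computes an expected cost rate for a candidate maintenance interval and locates its minimum. The Weibull life model and every numerical value (including the VPF and fatality probability) are illustrative assumptions, not figures from the paper or its railway case.

```python
import math

def expected_cost_rate(tau, c_pm=10e3, c_cm=200e3, vpf=30e6, p_fat=1e-4,
                       eta=8.0, beta=2.5):
    """Classical cost-benefit objective for a maintenance interval tau (years):
    preventive-maintenance cost plus expected corrective/production-loss cost
    plus expected fatality loss weighted by a value of prevented fatality
    (VPF), all per year.  Failure probability within a cycle follows an
    assumed Weibull life distribution with scale eta and shape beta."""
    f = 1 - math.exp(-(tau / eta) ** beta)   # P(failure before next maintenance)
    return (c_pm + f * (c_cm + vpf * p_fat)) / tau

# Grid search for the cost-optimal interval over 0.1 .. 19.9 years.
best = min((expected_cost_rate(t / 10), t / 10) for t in range(1, 200))
```

The point of the paper is precisely that this mechanical minimum should not be the last word when the fatality term is large or highly uncertain; here the optimum simply balances frequent preventive cost against the growing failure risk.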

9.
General equations and numerical tables are developed for quantifying the probabilities of sequentially dependent repeatable human errors. Such errors are typically associated with testing, maintenance or calibration (so-called “pre-accident” or “pre-initiator” tasks) of redundant safety systems. Guidance is presented for incorporating dependent events in large system fault tree analysis using implicit or explicit methods. Exact relationships between these methods, as well as numerical tables and simple approximate methods for system analysis, are described. Analytical results are presented for a general human error model, while the numerical tables are valid for a specific Handbook (THERP) model. Relationships with earlier methods are pointed out, and guides are proposed for error probability quantification.
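As an illustration of the kind of dependence model the Handbook (THERP) uses, the sketch below encodes the standard THERP conditional-error formulas and the resulting joint failure probability of n redundant tasks. Applying the same dependence level to every successive conditional term is a simplifying assumption made here for illustration.

```python
THERP_DEPENDENCE = {          # Handbook (THERP) conditional-error formulas:
    'zero':     lambda p: p,                   # independent repeats
    'low':      lambda p: (1 + 19 * p) / 20,
    'moderate': lambda p: (1 + 6 * p) / 7,
    'high':     lambda p: (1 + p) / 2,
    'complete': lambda p: 1.0,                 # second error certain given first
}

def joint_error_probability(p, n, level):
    """P(all n redundant tasks fail) when each task has nominal human error
    probability p and successive failures are coupled at the given THERP
    dependence level: p * cond**(n-1), with cond the conditional error
    probability given that the previous task failed."""
    cond = THERP_DEPENDENCE[level](p)
    return p * cond ** (n - 1)
```

For small p the spread is dramatic: three fully independent calibration errors are about p cubed, whereas complete dependence collapses the triple failure back to p, which is why dependence treatment dominates pre-initiator quantification.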

10.
A lamination technique for liquid crystal polymer (LCP)/Cu was developed for high-speed and high-performance printed circuit boards (PCBs). This was accomplished by using a modified surface activated bonding (SAB) process to achieve enhanced adhesion and a smooth interface. A systematic investigation of the peel strength of four categories of samples, namely “as bonded”, “annealed”, “Cu-deposited”, and “Cu-deposited and annealed”, showed the highest peel strength in the “Cu-deposited and annealed” sample. Significant improvements in adhesion were observed in samples cleaned with argon radio-frequency (Ar-rf) plasma (“as bonded” samples) followed by Cu deposition on the LCP, which were heated after bonding at low vacuum pressure at 240 °C (peel strength about 70–75 times higher than that of “as bonded”). XPS analyses of the peeled surfaces of the “Cu-deposited and annealed” sample reveal bulk fracture in the LCP. The threefold lower conduction loss of the SAB-processed laminate compared with a conventional heat laminate was most likely due to the smooth interface of the SAB-processed laminate (surface roughness ninefold lower than that of the conventional heat laminate). A plausible adhesion mechanism for Cu/LCP is the bonding of Cu adhesion sites to plasma-induced dangling sites on the LCP surface, together with thermal reconstruction of the Cu-deposited layers.

11.
Robert B. Technology in Society, 2003, 25(4): 513–516
Three tasks must be included when considering the broad topic of urban security. The first is to define the term “critical infrastructure.” Second, security must be viewed from a systems perspective when looking at cities and the infrastructure that serves them. Third, careful scrutiny must be given to heretofore unconsidered vulnerabilities that exist in every major city.

In the hours and days immediately following the attacks on September 11, everything from foot bridges to tall buildings was considered critical infrastructure. But, clearly, not everything in such a broad definition can be defended. So then, what is today's definition of critical infrastructure? One might be a new version of the “3 R's”—resist, respond, recover. In those terms, “critical infrastructure” could be defined as: (a) systems whose rapid failure would lead to a catastrophic loss of life; (b) systems whose failure or significant degradation would lead to unacceptable economic consequences; (c) systems whose rapid failure would significantly impact rescue and response efforts; and (d) systems whose significant degradation would severely impact recovery efforts.

Resist? It would be impossible for a city to resist everything, everywhere. The ability to respond to some events would require efforts that are above and beyond the realistic capability of any city. That moves the scenario to recovery and rebuilding.

12.
The freezing process is widely used in the food industry. In the 1970s, the French regulatory authorities, in collaboration with the food industry, created the concept of the «surgélation» process with the objective of improving the image of high-quality frozen foods. The “surgélation” process, which could be translated as “super freezing”, corresponds to a freezing process in which a final temperature of −18 °C must be reached “as fast as possible”. This concept was proposed in opposition to a conventional “freezing” process, for which no specific freezing rate is expected and the final storage temperature may be only −12 °C. The objective of this work is to propose a methodology to evaluate the mean amount of frozen ice in a complex food as a function of temperature, and to deduce a target temperature below which the food may be considered “frozen”. Based on the definition proposed by the IIF-IIR red book, this target temperature has been defined as the temperature at which 80% of the freezable water is frozen. A case study is presented for a model food made of two constituents.
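The temperature dependence of the ice fraction can be sketched with the classical dilute-solution relation. The initial freezing point used below is a placeholder, and the relation x = 1 − T_if/T is a textbook approximation, not the two-constituent model developed in the paper.

```python
def frozen_fraction(T, T_if=-1.0):
    """Fraction of the freezable water that is frozen at temperature T (degC),
    from the classical dilute-solution (Raoult-type) model x = 1 - T_if/T,
    valid for T below the initial freezing point T_if (also degC, negative).
    Illustrative model only; bound water is assumed already excluded."""
    if T >= T_if:
        return 0.0
    return 1.0 - T_if / T

def target_temperature(T_if=-1.0, fraction=0.80):
    """Temperature at which `fraction` of the freezable water is frozen
    (the IIF-IIR-based 80% criterion used in the paper): solving
    1 - T_if/T = fraction gives T = T_if / (1 - fraction)."""
    return T_if / (1.0 - fraction)
```

With T_if = −1 °C the 80% criterion gives a target of −5 °C; a food with a lower initial freezing point has a correspondingly lower target temperature, which is why the target must be evaluated per product.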

13.
The expression, “ethics of family planning,” it is argued, has no firm meaning, and should not be taken to imply that a full set of moral rules and principles governing family planning has been or is likely to be established. A survey is made of recent views on population and economic and social development, and it is argued that, although there is, indeed, no “universal problem” of population, the optimistic — as well as the pessimistic — view of this relationship is open to doubt. It is further argued that “ethics” cannot be imposed on the subject matter of population from without: the very identification of a “problem” of population is evaluative from the start. A scheme of analysis to appraise the ethical status of measures to arrest or promote population growth is proposed, and a number of such measures are critically analyzed.

14.
Dr. Ted Trainer's paper in this issue contends that “de-materialisation” (decreasing energy and material inputs per unit of output) is a “myth” that must now be dropped from arguments against the “limits to growth” thesis. His specific arguments against de-materialisation are questioned in this commentary. The commentary goes on to argue that even if de-materialisation has not taken place, it does not follow that near-term “zero growth” becomes necessary. On the contrary, the “limits to growth” position rests on erroneous Malthusian projections, and if the scarcity and spillover effects of growth are appropriately priced, conservation and substitution will be induced. Economic growth will facilitate technological and economic solutions to pollution and depletion. Institutional arrangements that structure incentives, such as making better use of markets to set appropriate prices, are at the heart of the sustainability problem.

15.
The cost of injuries and “accidents” to an organisation is very important in establishing how much it should spend on safety control. Despite the usefulness of information about the cost of a company's accidents, it is not customary accounting practice to make these data available. Of the two kinds of costs incurred by a company through occupational injuries and accidents, direct costs and indirect costs, the direct costs are much easier to estimate. However, the uninsured costs are usually more critical and should be estimated by each company. The authors investigate a general model to estimate the above costs and hence to establish efficient safety control. One construction company served as a pilot for this study. By analysing actual company data for three years, it is found that the efficient safety control cost should be 1.2–1.3% of total contract costs.

16.
Recently, several manufacturers of domestic refrigerators have introduced models with “quick thaw” and “quick freeze” capabilities. In this study, the time required for freezing and thawing different meat products was determined for five different models of household refrigerator. Two refrigerators had “quick thaw” compartments and three had “quick freeze” capabilities. It was found that some refrigerator models froze and thawed foods significantly faster than others (P<0.05). The refrigerators with the fastest freezing and thawing times were those with “quick thaw” and “quick freeze” capabilities. Heat transfer coefficients ranged from 8 to 15 W m−2 K−1 during freezing, and overall heat transfer coefficients ranged from 5 to 7 W m−2 K−1 during thawing. Mathematical predictions of freezing and thawing times in the refrigerators gave results similar to those obtained in experiments. With these results, manufacturers can improve the design of refrigerators with quick thawing and freezing functions.
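Freezing-time predictions of the kind mentioned above are commonly made with Plank's equation; the sketch below uses it with invented slab geometry and thermal properties (the study's actual model and data are not reproduced here), with the surface heat transfer coefficient h in the 8–15 W m−2 K−1 range reported above.

```python
def plank_freezing_time(rho=1050.0, L=250e3, T_if=-1.0, T_a=-20.0,
                        a=0.05, h=10.0, k=1.1, P=0.5, R=0.125):
    """Plank's equation for the freezing time (s) of a slab of thickness a (m):

        t = rho*L / (T_if - T_a) * (P*a/h + R*a**2/k)

    P = 1/2, R = 1/8 for an infinite slab.  All property values are
    illustrative placeholders, not taken from the study:
      rho: density (kg m-3), L: latent heat of the freezable water (J kg-1),
      T_if: initial freezing point (degC), T_a: freezing-medium temperature,
      h: surface heat transfer coefficient (W m-2 K-1),
      k: thermal conductivity of the frozen product (W m-1 K-1)."""
    return rho * L / (T_if - T_a) * (P * a / h + R * a * a / k)
```

The two bracketed terms separate the surface (convective) and internal (conductive) resistances, so the model directly shows why raising h, as the "quick freeze" compartments do, shortens freezing time only until conduction through the frozen layer dominates.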

17.
The author argues that the United States has paid insufficient attention in recent years to the relationship between its overall economic well-being and technology. He states that a major goal of the Clinton Administration, and the objective of the Technology Administration in the Department of Commerce, is to work with industry to address these issues. The ways in which this is being done are examined, and the issues involved in answering the questions “Where are we?”, “Where are we going?”, and “How do we get there?” are discussed.

18.
Design seismic forces depend on the peak ground acceleration (PGA) and on the shape of the design spectrum curves dictated in building codes. At present there is no doubt that it is necessary to construct so-called “site- and region-specific” design input ground motions reflecting the influence of events of different magnitudes at different distances that may occur during a specified time period. A unified approach to the estimation of ground motion parameters is described. A collection of ground motion recordings of small to moderate (3.0–3.5 ≤ ML ≤ 6.5) earthquakes obtained during the execution of the Taiwan Strong Motion Instrumentation Program (TSMIP) since 1991 was used to study the source scaling model, attenuation relations and site effects in the Taiwan region. A stochastic simulation technique was applied to predict PGA and response spectra for the Taipei basin. “Site- and region-dependent” uniform hazard response spectra were estimated for various geological conditions in the Taipei basin using a technique of probabilistic seismic hazard analysis.

19.
Danish studies of traffic accidents at priority intersections have identified a particular type of accident, in which a car driver supposed to give way has collided with a bicycle rider on the priority road. Often the involved car drivers have maintained that they did not see the bicycle until immediately before the collision, even though the bicycle must have been clearly visible.

Similar types of accidents have been the subject of studies elsewhere. In the literature they are labelled “looked-but-failed-to-see”, because it seems clear that in many cases the car drivers have actually been looking in the direction of the other parties but have not seen (i.e. perceived the presence of) the other road user. This paper describes two studies approaching this problem.

One study is based on 10 self-reported near-accidents. It shows that “looked-but-failed-to-see” events do occur, especially for well-experienced drivers. The other study, based on gap acceptance, shows that car drivers' acceptance of gaps towards cyclists depends on whether or not another car is present. Hypotheses for driver perception and for accident countermeasures are discussed.


20.
Organizations that design and/or operate complex systems have to make trade-offs between multiple, interacting, and sometimes conflicting goals at both the individual and organizational levels. Identifying, communicating, and resolving the conflict or tension between multiple organizational goals is challenging. Furthermore, maintaining an appropriate level of safety in such complex environments is difficult for a number of reasons discussed in this paper. The objective of this paper is to propose a set of related concepts that can help conceptualize organizational risk, help managers understand the implications of various performance and resource pressures, and support appropriate trade-offs between efficiency and thoroughness that maintain system safety. The concepts introduced here include (1) the thoroughness–efficiency space for classifying organizational behavior, and the various resource/performance and regulatory pressures that can displace organizations from one quadrant to another within this space; (2) the thoroughness–efficiency barrier and safety threshold; and (3) the efficiency penalty that organizations should accept, and not trade against organizational thoroughness, in order to maintain safety. Unfortunately, many accidents share a conceptual sameness in the way they occur. That sameness can be related to the dynamics conceptualized in this paper and the violation of the safety threshold. It is the sad story of the Bhopal accident, the Piper Alpha accident, and scores of others. Finally, we highlight the importance of a positive safety culture as an essential complement to regulatory pressure in maintaining safety. We illustrate the “slippery slope of thoroughness” along which organizational behavior slides under the influence of performance pressure, and suggest that a positive safety culture can be conceived of as “pulling this slippery slope” up and preventing violation of the safety threshold.
