Similar Documents
20 similar documents found.
1.
Careful accident investigation provides opportunities to review safety arrangements in socio-technical systems. There is consensus that human intervention is involved in the majority of accidents. Mindful of the consequences of such a claim for the apportionment of blame, several authors have highlighted the importance of investigating organizational factors in this respect. The International Maritime Organization (IMO) adopted specific regulations to limit what were perceived as unsuitable organizational influences in shipping operations, and guidance is provided for the investigation of human and organizational factors involved in maritime accidents. This paper presents a review of 41 accident investigation reports related to machinery space fires and explosions. The objective was to determine whether organizational factors are identified during maritime accident investigations. An adapted version of the Human Factors Analysis and Classification System (HFACS), with minor modifications reflecting machinery space features, was used for this review. The results show that organizational factors were not identified by maritime accident investigators to the extent expected had the IMO guidelines been observed; instead, contributing factors at the lower end of the organizational echelons are over-represented.

2.
A new measure for supplier performance evaluation
Recently the concept of dimensional analysis was used to propose a supplier performance measure and to obtain an index called the VPI (Vendor Performance Index). The criteria used in supplier performance evaluation typically include both quantitative and qualitative criteria. Here a new supplier performance measure is proposed as an alternative to the VPI. For qualitative criteria, a two-directional consideration is used instead of a one-directional approach, which yields only a single score. The fuzzy bag method is used to compensate for blindness in human judgement. All scores for quantitative and qualitative criteria are then combined in an intuitive sum of weighted averages called the SUR. The SUR is illustrated and compared with the VPI by means of two examples.
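The abstract does not give the SUR formula, so the following is only a minimal sketch of the general idea it describes: normalised quantitative scores and qualitative scores combined into one weighted average. All criterion names, scores and weights are invented for illustration.

```python
# Illustrative sketch only; not the paper's exact SUR formula.

def sur_score(quant_scores, qual_scores, quant_weights, qual_weights):
    """Weighted average of quantitative and qualitative criterion scores (0-1 scale)."""
    total_w = sum(quant_weights.values()) + sum(qual_weights.values())
    weighted = sum(quant_scores[c] * w for c, w in quant_weights.items())
    weighted += sum(qual_scores[c] * w for c, w in qual_weights.items())
    return weighted / total_w

# Hypothetical supplier: quantitative criteria already normalised to [0, 1],
# qualitative criteria scored in [0, 1] (e.g. from a fuzzy assessment).
quant = {"price": 0.8, "on_time_delivery": 0.9}
qual = {"service": 0.7, "flexibility": 0.6}
w_quant = {"price": 0.4, "on_time_delivery": 0.3}
w_qual = {"service": 0.2, "flexibility": 0.1}

print(round(sur_score(quant, qual, w_quant, w_qual), 3))  # -> 0.79
```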

3.
Auger electron spectroscopy (AES) has rapidly developed from a purely research-oriented technique into an extremely versatile analytical method with unique features for qualitative and quantitative materials characterization. The low escape depth (5–10 Å) of the emitted Auger electrons makes it ideal for surface analysis and, when combined with in situ ion sputtering, for depth-profile analysis of impurity distributions. Both surface and depth-profile analysis can be performed on a selected area by the use of optical or primary electron beam scanning techniques. Results are presented which illustrate the significance of the analytical capabilities of AES in the qualitative and quantitative analysis of thin film electronic materials used in the fabrication of precision thin film resistors, capacitors and conductors.

4.
In the late 1980s, amidst the qualitative and quantitative validation of certain Human Reliability Assessment (HRA) techniques, there was a desire for a new technique specifically for a nuclear reprocessing plant then being designed. The technique was to have the following attributes: it should be data-based rather than relying on pure expert judgement; it was to be flexible, allowing both relatively rapid screening and more detailed assessment; and it was to support sensitivity analysis, so that Human Factors design-related parameters, albeit at a gross level, could be brought into the risk assessment equation. The available techniques and literature were surveyed, and it was decided that no single technique fulfilled these requirements, so a new approach was developed. Two techniques were devised: the Human Reliability Management System (HRMS) and the Justification of Human Error Data Information (JHEDI) technique, the latter being essentially a quicker screening version of the former. Both techniques carry out task analysis, error analysis, and Performance Shaping Factor-based quantification, but JHEDI involves less detailed assessment than HRMS. Additionally, HRMS can be used to determine error reduction mechanisms, based on the way the Performance Shaping Factors contribute to the assessed error probabilities. Both techniques are fully computerised, and assessments are highly documentable and auditable, which was seen as a useful feature both by the company developing the techniques and by the regulatory authorities assessing the final risk assessments into which these two techniques fed data. This paper focuses in particular on the quantification process used by these techniques. The quantification approach for both techniques was principally one of extrapolation from real data to the desired Human Error Probability (HEP), based on a comparison between the Performance Shaping Factor (PSF) profile of the real task or scenario and that of the task or scenario to be assessed. The extrapolation process therefore required a set of PSF profiles for a number of real and representative data-based scenarios, together with empirically derived rules for extrapolating to new but similar scenarios. Using existing nuclear chemical plant human error and other data, a PSF profiling and extrapolation system was developed which could assess most HEPs required for nuclear chemical risk assessments. The two techniques were then employed in a major risk assessment, with HRMS being used for approximately twenty high-risk scenarios and JHEDI being used by a number of assessors to calculate well over five hundred HEPs for a large range of tasks and scenarios.
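The HRMS/JHEDI extrapolation rules themselves are not given in the abstract. As a rough, hedged illustration of the general idea of PSF-profile-based extrapolation (pick the closest data-based scenario and adjust its HEP), the sketch below uses entirely invented PSF names, profiles, reference HEPs and an arbitrary adjustment rule.

```python
import math

# Illustrative sketch only; not the actual HRMS/JHEDI quantification rules.
PSFS = ["time_pressure", "training", "interface", "procedures"]  # hypothetical

# Data-based reference scenarios: PSF ratings (0 = best, 1 = worst) and observed HEP.
reference = {
    "routine_sampling": {"profile": [0.2, 0.3, 0.2, 0.1], "hep": 1e-3},
    "alarm_response":   {"profile": [0.7, 0.4, 0.5, 0.6], "hep": 3e-2},
}

def distance(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def extrapolate_hep(task_profile):
    """Nearest reference scenario, with a crude HEP adjustment by mean PSF rating."""
    name, ref = min(reference.items(),
                    key=lambda kv: distance(task_profile, kv[1]["profile"]))
    ratio = (sum(task_profile) / len(task_profile)) / (sum(ref["profile"]) / len(ref["profile"]))
    return name, ref["hep"] * ratio

print(extrapolate_hep([0.6, 0.5, 0.4, 0.5]))  # -> ('alarm_response', ~2.7e-2)
```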

5.
Platelet adhesion and activation rates are frequently used to assess the thrombogenicity of biomaterials, a crucial step in the development of blood-contacting devices. Until now, electron and confocal microscopy have been used to investigate platelet activation, but they fail to characterize this activation quantitatively and in real time. To overcome these limitations, quartz crystal microbalance with dissipation monitoring (QCM-D) was employed, and an explicit time scale introduced into the dissipation versus frequency (Df–t) plots provided quantitative data at different stages of platelet activation. The QCM-D chips were coated with thrombogenic and non-thrombogenic model proteins to develop the methodology, which was then extended to investigate polymer thrombogenicity. Electron microscopy and immunofluorescence labelling were used to validate the QCM-D data and confirmed the relevance of Df–t plots for discriminating the activation rate among protein-modified surfaces. The responses showed the predominant role of surface hydrophobicity and roughness in platelet activation and thereby in polymer thrombogenicity. Modelling the QCM-D experimental data with a Matlab code allowed the rate at which mass change occurs (A/B) to be defined, an A/B value to be obtained for each polymer, and this value to be correlated with polymer thrombogenicity.
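The paper's Matlab model (the A/B rate parameter) is not reproduced here. As a hedged, generic illustration of extracting a mass-change rate from a QCM-D frequency trace, the sketch below applies the standard Sauerbrey relation to synthetic data; the trace, fit window and rate definition are assumptions, not the authors' method.

```python
import numpy as np

# Illustrative sketch only. C = 17.7 ng cm^-2 Hz^-1 is the usual Sauerbrey
# constant for a 5 MHz AT-cut crystal; df_n is the overtone-normalised
# frequency shift (synthetic data).
C_SAUERBREY = 17.7                           # ng / (cm^2 * Hz)

t = np.linspace(0, 600, 61)                  # time, s
df_n = -30.0 * (1.0 - np.exp(-t / 200.0))    # synthetic Δf/n trace, Hz

mass = -C_SAUERBREY * df_n                   # areal mass uptake, ng/cm^2
rate = np.polyfit(t[:10], mass[:10], 1)[0]   # initial slope, ng cm^-2 s^-1
print(f"initial adsorption rate ~ {rate:.2f} ng cm^-2 s^-1")
```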

6.
The convenience and accessibility of mobile banking applications have made this the preferred method of banking in the UK. Although popular amongst the younger generation, uptake is significantly lower among those aged 55+, which some attribute to cyber security and privacy concerns. This study proposes a model that can be used to measure the influence of cyber security factors on the intention to use mobile banking applications among UK users aged 55+. The unified theory of acceptance and use of technology (UTAUT) model was modified to include perceived cyber security risk, perceived cyber security trust, and perceived overall cyber security. Unlike similar studies, which have been solely quantitative, this research brings further insight using a mixed-methods approach that harnesses both qualitative and quantitative data. The research model was tested using partial least squares structural equation modelling on coded questionnaire data collected from 191 participants. Qualitative data were analysed through a thematic analysis, and both sets of data were then combined in a convergent mixed-methods analysis. The results show that performance expectancy, followed by perceived cyber security risk, are the main determinants of the intention to use mobile banking applications among the UK 55+.

7.
Safety assessment based on conventional tools (e.g. probabilistic risk assessment (PRA)) may not be well suited to systems with a high level of uncertainty, particularly in the feasibility and concept design stages of a maritime or offshore system. By contrast, a safety model using a fuzzy logic approach employing fuzzy IF–THEN rules can model the qualitative aspects of human knowledge and reasoning processes without requiring precise quantitative analyses. A fuzzy-logic-based approach may therefore be more appropriate for carrying out risk analysis in the initial design stages, as it provides a tool for working directly with the linguistic terms commonly used in safety assessment. This research focuses on the development and representation of linguistic variables to model risk levels subjectively; these variables are then quantified using fuzzy sets. In this paper, the development of a safety model using a fuzzy logic approach for modelling various design variables for maritime and offshore safety-based decision making in the concept design stage is presented. An example is used to illustrate the proposed approach.
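To make the fuzzy IF–THEN idea concrete, here is a minimal Mamdani-style sketch (linguistic inputs, two invented rules, centroid defuzzification). The membership functions, rule base and variable names are assumptions for illustration only, not the paper's model.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function over universe x."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

risk = np.linspace(0, 10, 101)       # output universe: risk level 0..10
risk_low  = tri(risk, -5, 0, 5)      # peak at 0
risk_high = tri(risk, 5, 10, 15)     # peak at 10

def assess(likelihood, severity):
    """likelihood, severity in [0, 1]; returns a crisp risk score in [0, 10]."""
    lik_low, lik_high = 1.0 - likelihood, likelihood
    sev_low, sev_high = 1.0 - severity, severity
    # Rule 1: IF likelihood is high AND severity is high THEN risk is high.
    # Rule 2: IF likelihood is low  OR  severity is low  THEN risk is low.
    w_high = min(lik_high, sev_high)
    w_low  = max(lik_low, sev_low)
    aggregated = np.maximum(np.minimum(w_high, risk_high),
                            np.minimum(w_low, risk_low))
    return float(np.sum(aggregated * risk) / np.sum(aggregated))  # centroid

print(round(assess(likelihood=0.8, severity=0.7), 2))
```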

8.
Despite continuous progress in research and applications, one of the major weaknesses of current HRA methods lies in their limited capability to model the mutual influences between performance shaping factors (PSFs). At least two types of dependency between PSFs can be defined: (i) dependency between the states of the PSFs; and (ii) dependency between the influences (impacts) of the PSFs on human performance. This paper introduces a method, based on the Analytic Network Process (ANP), for the quantification of the latter, in which the overall contribution of each PSF (its weight) to the human error probability (HEP) is eventually returned. The core of the method is the modelling process, articulated in two steps: first, a qualitative network of dependencies between PSFs is identified; then, the importance of each PSF is quantitatively assessed using ANP. The model distinguishes two components of PSF influence: direct influence, i.e. the influence the considered PSF exerts by itself, notwithstanding the presence of other PSFs; and indirect influence, i.e. the incremental influence of the considered PSF through its effect on other PSFs. A case study in Air Traffic Control is presented in which the proposed approach is integrated into the cognitive simulator PROCOS. The results show a significant modification of the influence of PSFs on operator performance when dependencies are taken into account, underlining the importance of considering not only the possible correlation between the states of PSFs but also their mutual dependency in affecting human performance in complex systems.
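A minimal ANP-style sketch of the weighting step (not the paper's actual network or data): a column-stochastic supermatrix of influences among PSFs is raised to a high power, and the converged columns give each PSF's overall (direct plus indirect) weight. The PSF names and influence values are invented.

```python
import numpy as np

psfs = ["fatigue", "workload", "interface", "training"]   # hypothetical PSFs

# Entry [i, j] = relative influence of PSF i on PSF j (each column sums to 1).
W = np.array([
    [0.40, 0.30, 0.10, 0.20],
    [0.30, 0.40, 0.30, 0.20],
    [0.20, 0.20, 0.40, 0.20],
    [0.10, 0.10, 0.20, 0.40],
])

limit = np.linalg.matrix_power(W, 50)   # limit supermatrix (columns converge)
weights = limit[:, 0]                   # any column gives the steady-state weights

for name, w in zip(psfs, weights):
    print(f"{name:10s} {w:.3f}")
```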

9.
Our first objective is to provide a panorama of the Human Reliability data used in EDF's Probabilistic Safety Studies, and then, since these concepts are at the heart of Human Reliability and its methods, to revisit the notion of human error and the understanding of accidents. We are not sure today that it is actually possible to provide a foolproof and productive theoretical framework in this field. Consequently, the aim of this article is to suggest potential paths of action and to report on EDF's progress along those paths, which enables us to produce the most useful possible Human Reliability analyses while taking into account current knowledge in the Human Sciences. The second part of this article illustrates our point of view as EDF researchers through the analysis of the most famous civil nuclear accident, the Three Mile Island (Unit 2) accident of 1979. Analysis of this accident allowed us to validate our position that, in the case of an accident, one needs to move from the concept of human error to that of systemic failure in the operation of systems such as a nuclear power plant. These concepts rely heavily on the notion of distributed cognition, and we explain how we applied it. They were implemented in the MERMOS method used in the latest EDF Probabilistic Human Reliability Assessment. Besides the fact that it is not very productive to focus exclusively on individual psychological error, the design of the MERMOS method and its implementation have confirmed two things: the significance of qualitative data collection for Human Reliability, and the central role held by Human Reliability experts in building knowledge about emergency operation, which in effect constitutes Human Reliability data collection. The latest conclusion drawn from the implementation of MERMOS is that, considering the difficulty of building 'generic' Human Reliability data in this field, the best data available to the analyst consist of the knowledge built up through already existing probabilistic analyses.

10.
A new flight-crew human error risk assessment method, F HECA, is proposed to identify and analyse crew human error risk in airlines. Building on a summary and classification of crew human error types, the method takes human error severity as the evaluation index and combines its three variables, namely human error probability, severity of human error consequences, and human error influence probability, with the grey comprehensive evaluation method to quantitatively assess the severity of crew human errors and thereby assess crew human error risk. A case study shows that the method can be used to analyse crew human error risk, provides technical support for improving flight training and flight operations manual content, and offers a useful reference for human factors research in civil aviation.
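The abstract names grey comprehensive evaluation without detailing it. As a hedged illustration of one common form (grey relational analysis against a worst-case reference), the sketch below scores invented error types on the three severity variables; the weights, scores and the choice of reference sequence are assumptions, not the F HECA procedure.

```python
import numpy as np

criteria = ["error probability", "consequence severity", "influence probability"]
weights = np.array([0.4, 0.4, 0.2])        # hypothetical criterion weights

scores = {                                  # normalised scores in [0, 1] (invented)
    "checklist omission": np.array([0.7, 0.5, 0.6]),
    "mode confusion":     np.array([0.3, 0.9, 0.8]),
    "callout error":      np.array([0.5, 0.2, 0.3]),
}

reference = np.array([1.0, 1.0, 1.0])       # worst-case (most severe) reference
rho = 0.5                                   # distinguishing coefficient

deltas = {k: np.abs(reference - v) for k, v in scores.items()}
d_min = min(d.min() for d in deltas.values())
d_max = max(d.max() for d in deltas.values())

for name, d in deltas.items():
    xi = (d_min + rho * d_max) / (d + rho * d_max)   # grey relational coefficients
    grade = float(np.dot(weights, xi))               # weighted relational grade
    print(f"{name:20s} severity grade = {grade:.3f}")  # higher = more severe
```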

11.
The majority of accidents in hazardous activities are caused by human error. This problem is not new, and a good deal of research, application, and development of practical techniques for the analysis, prediction and reduction of human errors or their negative effects has taken place across a range of industries. Whilst human error within flight operations has for some time been the centre of exhaustive research and debate, a similar analysis within the field of air traffic management (ATM) is not so comprehensive; ATM may therefore be able to learn from other industries. This paper deals with an approach to ATM incident analysis being developed in the European ATM arena. This new approach aims to determine how and why human errors contribute to incidents, and thus how to improve human reliability within a high-reliability system. The developing approach is called HERA (Human Error in ATM). The paper reports on a formative part, or phase 1, of the project, which reviewed the theoretical and practical literature to determine the best conceptual framework on which to base an ATM incident analysis tool. The framework chosen is that of human performance from an information processing perspective, adapted to make it more contextually relevant to ATM. A prototype structure was adopted for a technique with which to analyse ATM incidents. This paper summarises the literature review and briefly describes the structure of the prototype technique.

12.
In this study, the performance of a counter-flow vortex tube was modelled as a function of the input parameters nozzle number (N), inlet gas density (air, oxygen, nitrogen, and argon) and inlet pressure (Pinlet), using a proposed hybrid method that combines a novel data preprocessing step called output-dependent feature scaling (ODFS) with an adaptive network-based fuzzy inference system (ANFIS), trained on experimentally obtained data. In the developed system, the output parameter, the temperature gradient between the cold and hot outlets, is determined from the input parameters (Pinlet), (N), and the density of the gases. To evaluate the performance of the hybrid method, the mean absolute error (MAE), mean square error (MSE), root mean square error (RMSE), coefficient of determination (R2), and Index of Agreement (IA) were used. The results obtained with the hybrid method are 9.0670e-004 (MAE), 5.8563e-006 (MSE), 0.0024 (RMSE), 1.00 (R2), and 1.00 (IA).
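For reference, the five goodness-of-fit measures named above can be computed as follows (standard definitions, with Willmott's formulation assumed for the Index of Agreement; the observed/predicted values are synthetic, not the paper's data).

```python
import numpy as np

obs  = np.array([12.1, 15.4, 18.0, 21.3, 25.7])   # observed ΔT, K (invented)
pred = np.array([12.0, 15.6, 17.8, 21.5, 25.6])   # predicted ΔT, K (invented)

err  = pred - obs
mae  = np.mean(np.abs(err))
mse  = np.mean(err ** 2)
rmse = np.sqrt(mse)
r2   = 1.0 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2)
ia   = 1.0 - np.sum(err ** 2) / np.sum(
           (np.abs(pred - obs.mean()) + np.abs(obs - obs.mean())) ** 2)  # Willmott's IA

print(f"MAE={mae:.4f}  MSE={mse:.4f}  RMSE={rmse:.4f}  R2={r2:.4f}  IA={ia:.4f}")
```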

13.
Lean principles have long been recognised as a source of competitive advantage. Although there are several measures for various aspects of lean production in the literature, there is no comprehensive measure of overall lean implementation in business firms. An appropriate measurement tool is needed to assess the effectiveness and efficiency of lean implementation throughout the entire organisation. Based on lean research, a comprehensive tool called the leanness assessment tool (LAT) is developed, using both quantitative (directly measurable and objective) and qualitative (perceptions of individuals) approaches to assess lean implementation. The LAT measures leanness using eight quantitative performance dimensions: time effectiveness, quality, process, cost, human resources, delivery, customer and inventory. The LAT also uses five qualitative performance dimensions (quality, process, customer, human resources and delivery) with 51 evaluation items. The fuzzy method allows managers to identify improvement needs in lean implementation, and the use of radar charts gives an immediate, comprehensive view of strong areas and those needing improvement. Practical uses of the LAT are discussed in the conclusion, along with possible limitations.
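A hedged sketch of how fuzzy qualitative ratings might be aggregated per dimension (not the published LAT): linguistic answers are mapped to triangular fuzzy numbers, averaged, and defuzzified by the centroid. The scale, dimensions and survey answers are invented.

```python
# Illustrative sketch only.
LINGUISTIC = {            # triangular fuzzy numbers (a, b, c) on a 0-10 scale
    "very poor": (0, 0, 2.5),
    "poor":      (0, 2.5, 5),
    "fair":      (2.5, 5, 7.5),
    "good":      (5, 7.5, 10),
    "excellent": (7.5, 10, 10),
}

def dimension_score(ratings):
    """Average the fuzzy ratings of one dimension and defuzzify by centroid."""
    n = len(ratings)
    a, b, c = (sum(LINGUISTIC[r][i] for r in ratings) / n for i in range(3))
    return (a + b + c) / 3.0    # centroid of a triangular fuzzy number

quality_items  = ["good", "fair", "excellent", "good"]   # hypothetical answers
delivery_items = ["fair", "poor", "fair"]

print("quality  :", round(dimension_score(quality_items), 2))
print("delivery :", round(dimension_score(delivery_items), 2))
```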

14.
The velocity field in shape sensitivity analysis is not uniquely defined, although it must meet numerous theoretical and practical criteria. These practical criteria can be used to compare the existing velocity field computation methods which meet the theoretical criteria, but only in qualitative terms. When the FEM is used in design sensitivity analysis (DSA), the DSA errors will, owing to the FE discretization error, depend on the design velocity field considered. This paper presents a numerical methodology for evaluating the quality of design velocity field computation methods in quantitative terms, based on an analysis of the DSA discretization error. The sensitivity of the squared energy norm (χm = ∂∥u∥²/∂am, am being a design variable) is taken as the magnitude with which to measure the error of the DSA. For h-refinements, the squared error in energy norm, ∥e(u)∥², and the error in χm, e(χm), are theoretically related by a constant which is independent of the refinement degree of the FE model. The quality of the design velocity field computation methods can therefore be assessed in terms of the stability of e(χm)/∥e(u)∥² over sequences of meshes. An example of the use of this methodology, in which six design velocity field computation methods are compared, is presented.
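A small numerical sketch of the stability criterion described above, with invented error data for two hypothetical velocity-field methods: the ratio e(χm)/∥e(u)∥² is computed on each mesh of an h-refined sequence, and a smaller relative spread of the ratio indicates a better method.

```python
import numpy as np

meshes = ["h1", "h2", "h3", "h4"]
energy_err_sq = np.array([4.0e-2, 1.1e-2, 2.8e-3, 7.1e-4])   # ||e(u)||^2 per mesh

e_chi = {   # e(chi_m) per mesh for two hypothetical velocity-field methods
    "boundary-layer velocity": np.array([8.1e-3, 2.2e-3, 5.7e-4, 1.4e-4]),
    "isoparametric velocity":  np.array([8.0e-3, 2.6e-3, 5.1e-4, 2.0e-4]),
}

for method, errs in e_chi.items():
    ratio = errs / energy_err_sq
    spread = ratio.std() / ratio.mean()     # relative spread of the ratio
    print(f"{method:25s} ratios={np.round(ratio, 3)}  rel. spread={spread:.2%}")
```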

15.
A new evaluation method called EFSA (error function superposition approximation) is proposed to obtain depth resolution functions on any multilayer structure in agreement with the conventional absolute depth resolution. The quantitative evaluation of Δz measured on multilayer samples helps to clarify the relationships between different interface broadening effects and makes the evaluation independent of the measuring system. It is shown that this method can be successfully applied to both SIMS and AES depth profile measurements and that it has proved useful in comparing results obtained in different laboratories.
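The EFSA algorithm itself is not reproducible from the abstract. As a hedged illustration of the underlying ingredient (an error-function fit to a single broadened interface, from which the conventional 16%-84% depth resolution Δz = 2σ follows), the sketch below fits synthetic profile data.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

def erf_step(z, z0, sigma):
    """Normalised concentration profile of a Gaussian-broadened interface."""
    return 0.5 * (1.0 + erf((z - z0) / (np.sqrt(2.0) * sigma)))

z = np.linspace(0, 60, 61)                             # sputter depth, nm
true = erf_step(z, 30.0, 4.0)
signal = true + np.random.default_rng(0).normal(0, 0.01, z.size)   # noisy profile

(z0, sigma), _ = curve_fit(erf_step, z, signal, p0=[25.0, 2.0])
print(f"interface at {z0:.1f} nm, depth resolution Δz(16-84%) = {2 * sigma:.1f} nm")
```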

16.
The rise of intelligent technological devices (ITDs), wearables and insideables, offers the possibility of enhancing human capabilities and skills. This study contributes to the literature on the impact of ethical judgements on the acceptance of ITDs by using the multidimensional ethics scale (MES) proposed by Shawver and Sennetti. The novelty of this study lies in using fuzzy set qualitative comparative analysis (fsQCA) instead of correlational methods to explain human behaviour (in this case, attitudes towards ITDs) from an ethical perspective. fsQCA evaluates the influence of ethical variables on the intention to use ITDs (and on the non-use of these technologies). Positive ethical evaluations of technology do not always ensure ITD acceptance, and unfavourable ethical perceptions may lead to its rejection. For wearables, we find that: (1) positive perceptions from a utilitarian perspective are key to explaining acceptance, and configurations leading to acceptance also require positive judgements on moral equity, egoism and contractualism; surprisingly, the relativism dimension participates in configurations that cause acceptance only when it is negated; (2) a single unfavourable perception from a contractualism or relativism perspective causes non-use, and a coupling of negative judgements on the moral equity, utilitarianism and egoism dimensions also produces resistance to wearables. For insideables, we note that: (1) the MES has weak explanatory power for the intention to use but is effective in understanding resistance to use; (2) a negative perception on any ethical dimension leads to resistance towards insideables.
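For readers unfamiliar with fsQCA, the two core measures it reports for a candidate configuration are set-theoretic consistency and coverage (Ragin's formulas). The sketch below computes them for one invented configuration and invented membership scores; the construct names follow the abstract but the data are not the study's.

```python
import numpy as np

# X = membership in the configuration "moral_equity AND utilitarian" (fuzzy AND = min),
# Y = membership in the outcome "intention to use wearables". All scores invented.
moral_equity = np.array([0.9, 0.7, 0.4, 0.8, 0.2, 0.6])
utilitarian  = np.array([0.8, 0.9, 0.3, 0.6, 0.1, 0.7])
intention    = np.array([0.9, 0.8, 0.2, 0.7, 0.3, 0.5])

X = np.minimum(moral_equity, utilitarian)   # fuzzy intersection of the conditions
Y = intention

consistency = np.sum(np.minimum(X, Y)) / np.sum(X)   # how far X is a subset of Y
coverage    = np.sum(np.minimum(X, Y)) / np.sum(Y)   # how much of Y is covered by X

print(f"consistency = {consistency:.2f}, coverage = {coverage:.2f}")
```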

17.
We study the idempotence of operators of the form ε∨id∧δ (where ε≤δ and both ε and δ are increasing) on a modular lattice, in relation to the idempotence of the operators ε∨id and id∧δ. We also consider the conditions under which ε∨id∧δ is the composition of ε∨id and id∧δ. The case where δ is a dilation and ε an erosion is of special interest. When the underlying lattice is a complete lattice on which Minkowski operations can be defined, we obtain very precise conditions for the idempotence of these operators. Here id∧δ is called an annular opening, ε∨id is called an annular closing, and ε∨id∧δ is called an annular filter. Our theory can be applied to the design of idempotent morphological filters which remove isolated spots in digital pictures.

18.
This paper presents a model to assess the contribution of Human and Organizational Factors (HOFs) to accidents. The proposed model consists of two phases. The first phase is a qualitative analysis of the HOFs responsible for accidents, which uses the Human Factors Analysis and Classification System (HFACS) to seek out latent HOFs. The hierarchy of HOFs identified in the first phase provides the inputs for the second phase, a quantitative analysis using a Bayesian Network (BN). The BN enhances the capability of HFACS by allowing investigators or domain experts to measure the degree of the relationships among the HOFs. To estimate the conditional probabilities of the BN, the fuzzy analytic hierarchy process and the decomposition method are applied in the model. Case studies show that the model is capable of seeking out critical latent human and organizational errors and of carrying out a quantitative analysis of accidents, from which corresponding safety prevention measures are then derived.
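To illustrate the HFACS-to-BN idea in miniature (not the paper's network or probabilities), the sketch below evaluates a three-node chain, organisational influence leading to inadequate supervision leading to an unsafe act, by brute-force enumeration; all probability values are invented.

```python
from itertools import product

p_org = 0.3                                         # P(organisational deficiency)
p_sup = {True: 0.6, False: 0.2}                     # P(inadequate supervision | org)
p_act = {(True, True): 0.7, (True, False): 0.4,     # P(unsafe act | org, supervision)
         (False, True): 0.5, (False, False): 0.1}

p_unsafe = 0.0
for org, sup in product([True, False], repeat=2):   # sum over all parent states
    p = (p_org if org else 1 - p_org)
    p *= (p_sup[org] if sup else 1 - p_sup[org])
    p_unsafe += p * p_act[(org, sup)]

print(f"P(unsafe act) = {p_unsafe:.3f}")
```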

19.
The results of a numerical model study of quasi-static crack growth are reported. The study made use of an elastic-plastic stringer sheet in which the stringers were broken in an appropriate manner to simulate crack growth. The stringer-sheet model was used to demonstrate the qualitative effect of material properties, specimen geometry, initial crack length, and type of applied load on crack growth. In addition, stringer-sheet analogs were constructed both for the R-curve concept of linear elastic fracture mechanics and for a modification of this concept in which K is replaced by the J-integral; the modified R-curve was called the Rp curve. The calculated stringer-sheet R and Rp curves were not material properties, but were influenced by the extent of plastic yielding during stable crack growth. In general, however, the Rp curve provided a somewhat better correlation of crack growth than the R-curve.

20.
Divergent laser illumination is commonly used in current designs of commercial electronic speckle pattern shearing interferometry (ESPSI), or shearography, systems for the qualitative non-destructive testing (NDT) of material defects. The growing demand for quantitative out-of-plane (OOP) and, more recently, in-plane (IP) ESPSI is raising the required quality of optical system design and analysis software. However, little attention is currently being given to understanding, quantifying and compensating for the numerous error sources. Data describing the measurement inaccuracy due to the divergence of the object illumination wavefront for an OOP interferometer are presented. The errors are measured by comparing divergent object illumination with collimated illumination, with respect to illumination angle, lateral shear and shearing direction. The results indicate that the magnitude of the relative error increases approximately as a power function as the distance from the illumination source decreases.
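One way to quantify the power-function trend reported above is a log-log linear fit of relative error against illumination distance; the sketch below uses synthetic data (the distances, error values and fitted exponent are not from the paper).

```python
import numpy as np

d   = np.array([0.3, 0.5, 0.8, 1.2, 2.0])         # illumination distance, m (invented)
err = np.array([4.1, 2.3, 1.4, 0.9, 0.5])         # relative error, % (invented)

b, log_a = np.polyfit(np.log(d), np.log(err), 1)  # fit err = a * d^b in log-log space
a = np.exp(log_a)
print(f"err ~ {a:.2f} * d^{b:.2f}  (negative exponent: error grows as d shrinks)")
```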
