Similar Literature
19 similar documents were retrieved (search time: 10 ms).
1.
Fuzzy coding versus crisp coding, and local coding versus global coding, are proposed as ways to transform a quantitative scale into a categorical scale. Such a transformation is seen as the most general technique for investigating either heterogeneous quantitative variables or variables with different scale models (both quantitative and qualitative). A major advantage of fuzzy coding is that the modalities of the space can be defined very early in the statistical analysis process, through discussion among several specialists. Multiple correspondence analysis (MCA) is proposed to investigate a table built from fuzzy coding, with rows corresponding to the empirical situations and columns to the modalities of the respective variables. Two examples are considered. First, a didactic data set is designed to compare principal component analysis, MCA with crisp coding, and MCA with fuzzy coding. Second, a sitting-posture study is used to show how relationships between objective and subjective data can be established. The empirical situations correspond to adjustment combinations of the seat, the table and the backrest; the variables are posture indicators and subjective assessments. The main result is that the subjective variables evolve much more consistently with the adjustments than the objective ones, so the connection between the two sets of variables is weak. The backrest is the furniture setting with the greatest influence. From the interpretation of the MCA factor planes, it is possible to identify the best and the worst adjustment combinations.
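A minimal numerical sketch of the coding step described above may help: a quantitative value is spread over three hypothetical categories with triangular membership functions (fuzzy coding), or assigned entirely to a single category (crisp coding), producing rows that can form one block of an MCA table. The category breakpoints and example values are illustrative assumptions, not taken from the paper.

    # Sketch of fuzzy vs. crisp coding of a quantitative variable into three
    # hypothetical categories ("low", "medium", "high"); each fuzzy-coded row
    # sums to 1. Breakpoints and values are illustrative only.
    import numpy as np

    def fuzzy_code(x, low, mid, high):
        """Return membership degrees (low, medium, high) for a scalar x."""
        if x <= low:
            return np.array([1.0, 0.0, 0.0])
        if x >= high:
            return np.array([0.0, 0.0, 1.0])
        if x <= mid:
            m = (x - low) / (mid - low)          # rises from 0 to 1
            return np.array([1.0 - m, m, 0.0])
        m = (x - mid) / (high - mid)
        return np.array([0.0, 1.0 - m, m])

    def crisp_code(x, low, mid, high):
        """Crisp coding assigns the full weight to the dominant category."""
        degrees = fuzzy_code(x, low, mid, high)
        out = np.zeros_like(degrees)
        out[np.argmax(degrees)] = 1.0
        return out

    values = [12.0, 47.5, 80.0]
    coded_fuzzy = np.vstack([fuzzy_code(v, 10, 50, 90) for v in values])
    coded_crisp = np.vstack([crisp_code(v, 10, 50, 90) for v in values])
    print("fuzzy coding:\n", coded_fuzzy)   # rows can form a block of an MCA table
    print("crisp coding:\n", coded_crisp)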

2.
User experience is the focus of interaction design, and designing for errors is crucial for improving user experience. One method of designing for errors is to identify human errors and then take corrective action on high-risk errors to reduce their adverse effects. In this study, we propose a hybrid approach for risk analysis of human errors affecting the user experience of interactive systems. In this approach, the systematic human error reduction and prediction approach (SHERPA) is first adopted to identify human errors related to user experience. Failure mode and effect analysis (FMEA) is then used to analyze the risk factors of each error: occurrence, severity, and detection. The fuzzy technique for order preference by similarity to ideal solution (TOPSIS) is subsequently used to calculate a risk priority number and rank the errors. Finally, corrective actions for high-risk errors are recommended. An in-vehicle information system was used to demonstrate the proposed approach. The results indicate that the approach can effectively analyze the risk of human errors affecting user experience and can serve as a general reliability approach for improving user experience in interaction design.
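The following sketch illustrates the ranking step only, under simplifying assumptions: each identified error is rated on occurrence, severity and detection with triangular fuzzy numbers, the ratings are defuzzified by their centroids, and a standard TOPSIS closeness coefficient orders the errors. The error names, ratings and equal weights are hypothetical; the full fuzzy TOPSIS procedure of the study is not reproduced.

    # Rank hypothetical errors by TOPSIS closeness on occurrence, severity,
    # detection (all treated as "larger = riskier"). Ratings are triangular
    # fuzzy numbers defuzzified by their centroids; all numbers are invented.
    import numpy as np

    def centroid(tfn):
        """Defuzzify a triangular fuzzy number (a, b, c) by its centroid."""
        a, b, c = tfn
        return (a + b + c) / 3.0

    fuzzy_ratings = {
        "wrong menu selection": [(5, 7, 9), (3, 5, 7), (5, 7, 9)],
        "missed warning":       [(1, 3, 5), (7, 9, 9), (3, 5, 7)],
        "late confirmation":    [(3, 5, 7), (1, 3, 5), (1, 3, 5)],
    }
    X = np.array([[centroid(t) for t in row] for row in fuzzy_ratings.values()])

    R = X / np.linalg.norm(X, axis=0)          # vector-normalize each column
    W = R * np.array([1 / 3, 1 / 3, 1 / 3])    # equal weights (an assumption)

    ideal, anti_ideal = W.max(axis=0), W.min(axis=0)
    d_plus = np.linalg.norm(W - ideal, axis=1)
    d_minus = np.linalg.norm(W - anti_ideal, axis=1)
    closeness = d_minus / (d_plus + d_minus)   # higher = closer to the riskiest profile

    for name, cc in sorted(zip(fuzzy_ratings, closeness), key=lambda t: -t[1]):
        print(f"{name}: closeness = {cc:.3f}")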

3.
When a system is tested, lower-level data may become available in addition to the system-level data, for example that a particular subsystem or component succeeded or failed. Treating such simultaneous multi-level data as independent is a mistake, because the observations are statistically dependent. In this paper, we show how to handle simultaneous multi-level data correctly in a reliability assessment. We do this by determining what information the simultaneous data provide about the component reliabilities, using generalized cut sets. We illustrate the methodology with an example of a low-pressure coolant injection system, using a Bayesian approach to make the reliability assessments.
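A small sketch can show why such data are dependent, using a hypothetical two-component series system rather than the paper's generalized cut-set treatment: a system-level result updates both component reliabilities jointly, so the posterior no longer factorizes. The test counts and the coarse grid posterior are purely illustrative.

    # Joint posterior of two component reliabilities (Beta(1,1) priors on a grid)
    # given both component-level and system-level test data for a hypothetical
    # two-component series system. All counts are invented.
    import numpy as np

    p = np.linspace(0.001, 0.999, 300)
    P1, P2 = np.meshgrid(p, p, indexing="ij")     # component reliabilities

    # component 1: 9/10 successes; component 2: 8/10; system: 4/5 (both must work)
    log_post = (
        9 * np.log(P1) + 1 * np.log(1 - P1)
        + 8 * np.log(P2) + 2 * np.log(1 - P2)
        + 4 * np.log(P1 * P2) + 1 * np.log(1 - P1 * P2)
    )
    post = np.exp(log_post - log_post.max())
    post /= post.sum()

    # the system-level data induce dependence: the joint posterior does not factorize
    mean_p1 = (P1 * post).sum()
    mean_p2 = (P2 * post).sum()
    corr = ((P1 - mean_p1) * (P2 - mean_p2) * post).sum() / np.sqrt(
        ((P1 - mean_p1) ** 2 * post).sum() * ((P2 - mean_p2) ** 2 * post).sum()
    )
    print(f"E[p1]={mean_p1:.3f}, E[p2]={mean_p2:.3f}, posterior corr={corr:.3f}")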

4.
Modeling uncertainty during risk assessment is a vital component of effective decision making. Unfortunately, most risk assessment studies struggle with uncertainty analysis. The development of tools and techniques for capturing uncertainty in risk assessment is ongoing, and there has been substantial growth in this respect in health risk assessment. In this study, cross-disciplinary approaches to uncertainty analysis are identified, and a modified approach suitable for industrial safety risk assessment is proposed using fuzzy set theory and Monte Carlo simulation. The proposed method is applied to a benzene extraction unit (BEU) of a chemical plant. The case study shows that the proposed method provides a better measure of uncertainty than existing methods: unlike traditional risk analysis, it takes both the variability and the uncertainty of the information into account in the risk calculation, and instead of a single risk value it provides an interval of risk values for a given percentile of risk. The implications of these results for risk control and regulatory compliance are also discussed.
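A minimal sketch of a hybrid propagation in this spirit, with invented numbers: an aleatory variable is sampled by Monte Carlo, an epistemic variable is a triangular fuzzy number handled through an alpha-cut, and the output is an interval of risk at a chosen percentile rather than a single value.

    # Hybrid Monte Carlo / fuzzy propagation sketch; every quantity is invented.
    import numpy as np

    rng = np.random.default_rng(0)
    N = 20_000
    frequency = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=N)  # events/year (aleatory)

    # triangular fuzzy consequence severity (min, mode, max) and one alpha-cut
    a, b, c = 1.0, 3.0, 8.0          # hypothetical consequence units (epistemic)
    alpha = 0.5
    sev_low = a + alpha * (b - a)    # lower bound of the alpha-cut interval
    sev_high = c - alpha * (c - b)   # upper bound

    risk_low = np.percentile(frequency * sev_low, 95)
    risk_high = np.percentile(frequency * sev_high, 95)
    print(f"95th-percentile risk lies in [{risk_low:.4f}, {risk_high:.4f}] per year")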

5.
The rise of intelligent technological devices (ITDs), namely wearables and insideables, offers the possibility of enhancing human capabilities and skills. This study contributes to the literature on the impact of ethical judgements on the acceptance of ITDs by using the multidimensional ethics scale (MES) proposed by Shwayer and Sennetti. The novelty of this study lies in using fuzzy-set qualitative comparative analysis (fsQCA), instead of correlational methods, to explain human behaviour (in this case, attitudes towards ITDs) from an ethical perspective. fsQCA evaluates the influence of the ethical variables on the intention to use ITDs (and on the non-use of these technologies). Positive ethical evaluations of a technology do not always ensure ITD acceptance, and unfavourable ethical perceptions may lead to its rejection. For wearables we find that: (1) positive perceptions from a utilitarian perspective are key in explaining acceptance, and we also identify configurations leading to acceptance in which positive judgements on moral equity, egoism and contractualism are needed; surprisingly, the relativism dimension participates in configurations that cause acceptance only when it is negated. (2) A single unfavourable perception from a contractualism or relativism perspective causes non-use, and a coupling of negative judgements on the moral equity, utilitarianism and egoism dimensions also produces resistance to wearables. For insideables, we observe that: (1) the MES has weak explanatory power for the intention to use but is effective in understanding resistance to use; (2) a negative perception on any ethical dimension leads to resistance towards insideables.
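For readers unfamiliar with fsQCA, the sketch below computes the two standard sufficiency measures, consistency and coverage, for one hypothetical configuration of ethical judgements against the acceptance outcome. The calibrated membership scores are invented and the configuration shown is not one of the study's results.

    # Standard fsQCA measures for "configuration -> outcome":
    #   consistency = sum(min(x, y)) / sum(x),  coverage = sum(min(x, y)) / sum(y),
    # where x is the configuration membership (min over conditions, negation = 1 - value).
    import numpy as np

    # calibrated memberships for five hypothetical respondents
    utilitarian = np.array([0.9, 0.7, 0.4, 0.8, 0.2])
    relativism  = np.array([0.2, 0.1, 0.8, 0.3, 0.9])
    acceptance  = np.array([0.9, 0.8, 0.3, 0.7, 0.1])

    # configuration: high utilitarian judgement AND negated relativism
    config = np.minimum(utilitarian, 1.0 - relativism)

    consistency = np.minimum(config, acceptance).sum() / config.sum()
    coverage = np.minimum(config, acceptance).sum() / acceptance.sum()
    print(f"consistency = {consistency:.2f}, coverage = {coverage:.2f}")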

6.
7.
There is a need to extend and refine the use of crash surrogates to enhance safety analyses, particularly given the data collection opportunities presented by naturalistic driving studies. This paper connects the original research on traffic conflicts to the contemporary literature on crash surrogates through the crash-to-surrogate ratio, π. A conceptual structure is developed in which the ratio can be estimated using either a logit or a probit formulation that captures context and event variables as predictors in the model specification. This allows the crash-to-surrogate concept to be expanded beyond traffic conflicts to many contexts and crash types.
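A minimal sketch of the estimation idea on simulated data: among observed surrogate events, a logit model relates the probability of escalation to a crash to context and event covariates, which is one way to estimate the crash-to-surrogate ratio π for a given condition. The covariates, coefficients and data below are assumptions for illustration only.

    # Logit estimate of P(crash | surrogate event) from simulated conflict data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 5_000
    speed = rng.normal(50, 10, n)              # km/h at the conflict (event variable)
    ttc = rng.uniform(0.2, 3.0, n)             # time-to-collision in seconds
    night = rng.integers(0, 2, n)              # context indicator

    # simulated escalation mechanism (assumed, for illustration only)
    logit = -2.5 + 0.03 * (speed - 50) - 1.5 * ttc + 0.5 * night
    crash = rng.random(n) < 1 / (1 + np.exp(-logit))

    X = np.column_stack([speed, ttc, night])
    model = LogisticRegression().fit(X, crash)

    # estimated crash-to-surrogate ratio for one context/event combination
    pi_hat = model.predict_proba([[60.0, 0.5, 1]])[0, 1]
    print(f"estimated P(crash | surrogate) = {pi_hat:.3f}")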

8.
An anecdotal introduction to the role of operational definitions in representing uncertainty is followed by a brief history of operational definitions, with particular attention to the foundations of probability and fuzzy representations of uncertainty. A short summary of experience at TU Delft points to relevant open questions. These in turn are illustrated by a recent application to NOx emissions in the Netherlands.

9.
Unfolded partial least-squares combined with residual quadrilinearization (U-PLS/RQL) is developed as a new latent-structured algorithm for processing fourth-order instrumental data. To check its analytical predictive ability, fluorescence excitation-emission-kinetic-pH data were measured and processed. The concentration of the fluorescent pesticide carbaryl was determined in the presence of the pesticides fuberidazole and thiabendazole as uncalibrated interferents, in the first example of fourth-order multivariate calibration. The hydrolysis of the analyte was followed at different pH values using a fast-scanning spectrofluorimeter, recording the excitation-emission fluorescence matrices during its evolution to 1-naphthol, which also emits fluorescence. A set of test samples containing the above-mentioned fluorescent contaminants was analyzed with the new model, and the results were compared with those from parallel factor analysis (PARAFAC). The newly developed U-PLS/RQL model provides better figures of merit for analyte quantitation (average prediction error 7 μg L⁻¹, relative prediction error 5%, calibration range 50-250 μg L⁻¹) and is considerably simpler to implement than PARAFAC. The latter, however, furnishes important physicochemical information about the chemical process under study, although this requires the data to be unfolded into an array of lower dimensions owing to the lack of quadrilinearity of the experimental data.
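The sketch below shows only the "unfolded PLS" part of the idea, on simulated data: each sample's four-way data array is vectorized into one long row, and an ordinary PLS regression maps it to concentration. The residual quadrilinearization step that handles uncalibrated interferents is omitted, and all dimensions and profiles are invented.

    # Unfold a four-way calibration array (excitation x emission x time x pH)
    # and fit a PLS regression to concentration; simulated data only.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(2)
    n_cal, dims = 9, (20, 30, 8, 4)             # calibration samples, data-array size

    profile = rng.random(dims)                  # simulated pure four-way analyte profile
    conc = np.linspace(50, 250, n_cal)          # e.g. micrograms per litre
    X = np.stack([c * profile + 0.05 * rng.random(dims) for c in conc])

    X_unfolded = X.reshape(n_cal, -1)           # unfolding: one long row per sample
    pls = PLSRegression(n_components=2).fit(X_unfolded, conc)

    test = 120 * profile + 0.05 * rng.random(dims)
    print(f"predicted concentration: {pls.predict(test.reshape(1, -1))[0, 0]:.1f}")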

10.
Fatal motor vehicle intersection crashes occurring in Norway in the years 2005–2007 were analyzed to identify causation patterns among their underlying contributing factors, and to assess whether the data collection and documentation procedures used by the Norwegian in-depth investigation teams produce the information necessary for causation pattern analysis. Twenty-eight fatal accidents were analyzed. Causation charts of contributing factors were first coded for each driver in each crash using the Driving Reliability and Error Analysis Method (DREAM). Next, the charts were aggregated based on a combination of conflict type and whether the driver was going straight or turning. The results indicate that drivers performing a turning maneuver in these crashes faced perception difficulties and unexpected behavior from the primary conflict vehicle while trying to negotiate a demanding traffic situation. Drivers who were going straight, on the other hand, had fewer perception difficulties but largely expected any turning drivers to yield, which led to either slow reactions or no reaction at all. In terms of common contributing factors, those often cited in the literature as contributing to fatal crashes, e.g. high speed, drugs and/or alcohol, and inadequate driver training, contributed in 12 of the 28 accidents. This confirms their prevalence, but also shows that most drivers end up in these situations through combinations of less auspicious contributing factors. In terms of data collection and documentation, there was an asymmetry in the reporting of obstructions to view caused by signposts and vegetation: these were frequently reported as contributing for turning drivers, but rarely for their counterparts in the same crashes. This probably reflects an involuntary focus by the analyst on identifying contributing factors for the driver held legally liable, with less attention paid to the driver judged not at fault. Since the question of whom to blame is often irrelevant from a countermeasure development point of view, this underlying investigator approach needs to be addressed to avoid future bias in crash investigation reports.

11.
In this paper, we consider a data envelopment analysis (DEA) framework to measure the overall profit efficiency of decision-making units (DMUs) subject to input and output uncertainty. Under uncertain conditions, classical methods can lead to solutions that are unrealistic in practice. In this work, robust optimization is proposed to incorporate uncertainty into the measurement of overall profit efficiency. In a robust optimization model, the uncertain parameters are assumed to belong to a specified uncertainty set, and the solution is required to remain efficient for all possible realizations in that set, even though it may not be optimal for any single value of the parameters. We show that the overall profit efficiency score need not correspond to the optimistic case, and that the decision maker can obtain the overall profit efficiency score corresponding to any value in the uncertainty set. The results of an experiment on bank data show that the robust overall profit efficiency score provides a significant improvement in performance assessment as the uncertainty increases.

Abbreviations: DEA: data envelopment analysis; DMUs: decision-making units; CRS: constant returns to scale; VRS: variable returns to scale; ROP: robust optimization problem; RC: robust counterpart; ROPE: robust overall profit efficiency; OOPE: optimistic overall profit efficiency; GAMS: generalized algebraic modeling system
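As a point of reference, the sketch below computes a nominal (non-robust) overall profit efficiency under variable returns to scale with a small linear program: maximize profit over the production possibility set spanned by the observed DMUs, then divide each DMU's actual profit by that maximum. The robust counterpart described in the paper would replace the nominal data with an uncertainty set; the data and prices here are hypothetical.

    # Nominal overall profit efficiency in DEA (VRS) via one linear program.
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 2.0], [6.0, 4.0]])   # inputs  (n x m)
    Y = np.array([[5.0], [6.0], [4.0], [9.0]])                        # outputs (n x s)
    w = np.array([1.0, 1.5])                                          # input prices
    p = np.array([2.0])                                               # output prices
    n, m = X.shape
    s = Y.shape[1]

    # decision variables: lambda (n), virtual inputs x (m), virtual outputs y (s)
    c = np.concatenate([np.zeros(n), w, -p])                 # minimize w.x - p.y
    A_in = np.hstack([X.T, -np.eye(m), np.zeros((m, s))])    # X'lambda <= x
    A_out = np.hstack([-Y.T, np.zeros((s, m)), np.eye(s)])   # y <= Y'lambda
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.zeros(m + s)
    A_eq = np.concatenate([np.ones(n), np.zeros(m + s)]).reshape(1, -1)  # sum(lambda) = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0])

    max_profit = -res.fun
    for j in range(n):
        actual = p @ Y[j] - w @ X[j]
        print(f"DMU {j + 1}: overall profit efficiency = {actual / max_profit:.2f}")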


12.
The performance of Radio-Isotope IDentification (RIID) algorithms using gamma spectroscopy is becoming increasingly important. For example, sensors at locations that screen for illicit nuclear material rely on isotope identification to resolve innocent nuisance alarms arising from naturally occurring radioactive material. Recent data collections for RIID testing consist of repeat measurements for each of several scenarios. Efficient allocation of measurement resources requires an appropriate number of repeats for each scenario. To help allocate measurement resources in such data collections, we consider using only a few real repeats per scenario. To reduce the uncertainty in the estimated RIID algorithm performance for each scenario, the potential merit of augmenting these real repeats with realistic synthetic repeats is also considered. Our results suggest that, for the scenarios and algorithms considered, approximately 10 real repeats augmented with simulated repeats yield an estimate with uncertainty comparable to an estimate based on 60 real repeats. Published in 2009 by John Wiley & Sons, Ltd.
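A rough sketch of the allocation question, with an invented identification probability: the Monte Carlo standard error of a per-scenario performance estimate from 10 repeats versus 60 repeats. The gap between the two is what realistic synthetic repeats are intended to close without the extra real measurements; this is a generic binomial illustration, not the paper's simulation method.

    # Standard error of an estimated per-scenario identification probability
    # as a function of the number of repeats; p_true is an invented value.
    import numpy as np

    rng = np.random.default_rng(3)
    p_true = 0.8

    def se_of_estimate(n_repeats, trials=5_000):
        """Monte Carlo standard error of the estimated probability."""
        estimates = rng.binomial(n_repeats, p_true, size=trials) / n_repeats
        return estimates.std()

    print(f"SE with 10 real repeats: {se_of_estimate(10):.3f}")
    print(f"SE with 60 real repeats: {se_of_estimate(60):.3f}")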

13.
This paper investigates the interaction between rapid granular flow and an obstacle. The distinct element method (DEM) is used to simulate the flow regimes observed in laboratory experiments. The relationship between the particle properties and the overall flow behaviour is obtained by using the DEM with a simple linear contact model. The flow regime is primarily controlled by the particle friction, the viscous normal damping and the particle rotation, rather than by the contact stiffness. Rolling constriction is introduced to account for dispersive flow. The velocity depth-profiles around the obstacles are not uniform but vary over the depth. The numerical results are compared with laboratory experiments of chute flow with dry granular material. Some important model parameters are obtained, which can be used to optimize defense structures in alpine regions.
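A minimal sketch of a linear contact law of the kind named above: the normal force is a linear spring plus viscous damping acting on the particle overlap, and the tangential force is a linear spring capped by Coulomb friction. The stiffness, damping and friction values are placeholders, not the calibrated parameters from the paper.

    # Linear spring-dashpot contact model for one particle contact in a DEM code.
    import numpy as np

    def linear_contact_force(overlap, overlap_rate, tangential_disp,
                             k_n=1e5, c_n=50.0, k_t=5e4, mu=0.5):
        """Return (normal force, tangential force) for a single contact."""
        if overlap <= 0.0:
            return 0.0, 0.0
        f_n = k_n * overlap + c_n * overlap_rate      # spring + viscous normal damping
        f_n = max(f_n, 0.0)                           # no tensile contact force
        f_t = k_t * tangential_disp                   # tangential spring
        f_t_max = mu * f_n                            # Coulomb friction limit
        return f_n, float(np.clip(f_t, -f_t_max, f_t_max))

    print(linear_contact_force(overlap=1e-4, overlap_rate=0.01, tangential_disp=2e-4))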

14.
Quantifying uncertainty during risk analysis has become an important part of effective decision making and health risk assessment. However, most risk assessment studies struggle with uncertainty analysis, even though uncertainty in model parameter values is of primary importance. Capturing uncertainty is therefore vital for a sound risk analysis. In this paper, an approach to uncertainty analysis based on fuzzy set theory and Monte Carlo simulation is proposed; the question then arises as to how these two modes of representing uncertainty can be combined for the purpose of estimating risk. The proposed method is applied to a propylene oxide polymerisation reactor and takes into account both stochastic and epistemic uncertainties in the risk calculation. The study explores areas where random and fuzzy logic models may be applied to improve risk assessment in industrial plants with dynamic systems (i.e. systems that change over time), and discusses the methodology and the process involved in using random and fuzzy logic systems for risk management.
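A minimal sketch of one way the two representations can be combined, with invented values: the stochastic variable is propagated by Monte Carlo while the fuzzy variable is swept over alpha-cuts, giving lower and upper bounds on a risk percentile at each membership level (a fuzzy description of the risk estimate).

    # Monte Carlo propagation of an aleatory failure rate combined with an
    # alpha-cut sweep over a triangular fuzzy consequence; all values invented.
    import numpy as np

    rng = np.random.default_rng(4)
    failure_rate = rng.lognormal(np.log(2e-3), 0.4, size=10_000)   # per year (aleatory)

    a, b, c = 10.0, 25.0, 60.0     # triangular fuzzy consequence (epistemic)
    for alpha in (0.0, 0.5, 1.0):
        lo, hi = a + alpha * (b - a), c - alpha * (c - b)          # alpha-cut interval
        r_lo = np.percentile(failure_rate * lo, 95)
        r_hi = np.percentile(failure_rate * hi, 95)
        print(f"alpha = {alpha:.1f}: 95th-percentile risk in [{r_lo:.3f}, {r_hi:.3f}]")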

15.
16.
17.
We model a value of statistical life (VSL) transfer function for application to road-safety engineering in developing countries through an income-disaggregated meta-analysis of scope-sensitive stated-preference VSL data. The income-disaggregated meta-analysis treats developing-country and high-income-country data separately. Previous transfer functions are based on aggregated datasets composed largely of data from high-income countries. Recent evidence, particularly with respect to the income elasticity of VSL, suggests that the aggregate approach is deficient because it does not account for a possible change in income elasticity across income levels. Our dataset (a minor update of the OECD database published in 2012) includes 123 scope-sensitive VSL estimates from developing countries and 185 scope-sensitive estimates from high-income countries. The transfer function for developing countries gives VSL = 1.3732E-4 × (GDP per capita)^2.478, with VSL and GDP per capita expressed in 2005 international dollars (an international dollar being a notional currency with the same purchasing power as the U.S. dollar). The function can be applied to low- and middle-income countries with GDP per capita above $1268 (with a data gap for very low-income countries), whereas it is not useful above a GDP per capita of about $20,000. The corresponding function built using high-income-country data is VSL = 8.2474E+3 × (GDP per capita)^0.6932; it is valid for high-income countries but over-estimates VSL for low- and middle-income countries. The research finds two principal significant differences between the transfer functions modeled using developing-country and high-income-country data, supporting the disaggregated approach. The first relates to between-country VSL income elasticity, which is 2.478 for the developing-country function and 0.693 for the high-income function; the difference is significant at p < 0.001. This difference was recently postulated but not analyzed by other researchers. The second difference is that the traffic-risk context affects VSL negatively in developing countries and positively in high-income countries. The research quantifies uncertainty in the transfer function using parameters of the non-absolute distribution of relative transfer errors. The low- and middle-income function is unbiased, with a median relative transfer error of −0.05 (95% CI: −0.15 to 0.03), a 25th-percentile error of −0.22 (95% CI: −0.29 to −0.19), and a 75th-percentile error of 0.20 (95% CI: 0.14 to 0.30). The quantified uncertainty characteristics support evidence-based approaches to sensitivity analysis and probabilistic risk analysis of economic performance measures for road-safety investments.
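The two transfer functions quoted above can be evaluated directly; the sketch below does so for a few arbitrary GDP-per-capita values inside each function's stated range of validity (all amounts in 2005 international dollars).

    # Evaluate the two VSL transfer functions quoted in the abstract.
    def vsl_low_middle_income(gdp_per_capita):
        # stated validity: roughly $1268 to about $20,000 GDP per capita
        return 1.3732e-4 * gdp_per_capita ** 2.478

    def vsl_high_income(gdp_per_capita):
        # stated validity: high-income countries
        return 8.2474e3 * gdp_per_capita ** 0.6932

    for gdp in (2_000, 10_000):
        print(f"GDP per capita {gdp}: developing-country VSL ~ {vsl_low_middle_income(gdp):,.0f}")
    print(f"GDP per capita 40000: high-income VSL ~ {vsl_high_income(40_000):,.0f}")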

18.
The least-squares analysis of data with error in both x and y is generally thought to yield best results when the quantity minimized is the sum of the properly weighted squared residuals in x and in y. As an alternative to this "total variance" (TV) method, "effective variance" (EV) methods convert the uncertainty in x into an effective contribution to the uncertainty in y; although easier to use, they are considered less reliable. There are at least two EV methods, which differ in how the weights are treated in the optimization. One of these is identical to the TV method for fits to a straight line. The formal differences among these methods are clarified, and Monte Carlo simulations are used to examine the statistical properties of each on the widely used straight-line model of York, a quadratic variation on this, Orear's hyperbolic model, a nonlinear binding (Langmuir) model, and Wentworth's kinetics model. The simulations confirm that the EV and TV methods are statistically equivalent in the limit of small data error, where they yield unbiased, normally distributed parameter estimates, with standard errors correctly predicted by the a priori covariance matrix. With increasing data error, these properties fail to hold, and the TV method is not always statistically best. Nonetheless, the differences between the methods should seldom be of practical significance, since they are likely to be small compared with uncertainties arising from incomplete information about the data error in x and y.
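A minimal sketch of the effective-variance idea for a straight-line fit y = a + b·x, on simulated data: the x uncertainty is folded into an effective y variance, σ_eff² = σ_y² + b²·σ_x², and the weighted fit is iterated because the weights depend on the current slope. How the weights are treated in that iteration is exactly where the EV variants discussed above differ.

    # Iteratively reweighted straight-line fit using effective-variance weights.
    import numpy as np

    rng = np.random.default_rng(5)
    true_a, true_b = 1.0, 2.0
    x_true = np.linspace(0.0, 10.0, 20)
    sigma_x = np.full(x_true.size, 0.1)              # per-point x uncertainties
    sigma_y = np.linspace(0.1, 0.5, x_true.size)     # per-point y uncertainties
    x = x_true + rng.normal(0.0, sigma_x)
    y = true_a + true_b * x_true + rng.normal(0.0, sigma_y)

    b = 0.0                                          # initial slope guess
    for _ in range(20):                              # iterate: weights depend on b
        w = 1.0 / (sigma_y**2 + b**2 * sigma_x**2)   # effective-variance weights
        A = np.vstack([np.ones_like(x), x]).T
        coeffs = np.linalg.solve(A.T @ (w[:, None] * A), A.T @ (w * y))
        a, b = coeffs
    print(f"fitted intercept = {a:.3f}, slope = {b:.3f}")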

19.
In this study, we investigated the dependence of dummy head injury mitigation on the side curtain airbag and occupant distance in a side impact of a Dodge Neon. Full-scale finite element vehicle simulations of a Dodge Neon with a side curtain airbag were performed to simulate the side impact. Owing to the wide range of parameters, an optimal matrix of finite element calculations was generated using the design of experiments (DOE) method; the DOE method was used to independently screen the finite element results and yield the desired parametric influences as outputs. Analysis of variance (ANOVA) techniques were also used to analyze the finite element results. The results clearly show that the moving deformable barrier (MDB) strike velocity was the strongest influence parameter for both the head injury criterion (HIC36) and the peak head acceleration, followed by the initial airbag inlet temperature. Interestingly, the influence of the initial airbag inlet temperature was only about 30% smaller than that of the MDB velocity, and the influence of the trigger time was about 54% smaller than that of the MDB velocity when considering the peak head accelerations. Given the wide range of MDB velocities used in this study, the results present an opportunity for design optimization using the different parameters to help mitigate occupant injury. As such, the initial airbag inlet temperature, the trigger time, and the airbag pressure should be incorporated into the vehicle design process when optimizing for the head injury criterion.
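For reference, the head injury criterion used above can be computed from an acceleration trace as HIC = max over windows (t2 − t1) ≤ 36 ms of (t2 − t1)·[(1/(t2 − t1))·∫a dt]^2.5, with a in g and t in seconds. The sketch below applies this standard formula to a synthetic half-sine pulse; it is not output from the finite element simulations.

    # Brute-force HIC36 from a sampled head-acceleration trace.
    import numpy as np

    def hic(time_s, accel_g, max_window_s=0.036):
        """Maximize (t2-t1) * (mean acceleration)^2.5 over windows <= max_window_s."""
        cum = np.concatenate([[0.0], np.cumsum(np.diff(time_s) *
                                               0.5 * (accel_g[1:] + accel_g[:-1]))])
        best = 0.0
        for i in range(len(time_s) - 1):
            for j in range(i + 1, len(time_s)):
                dt = time_s[j] - time_s[i]
                if dt > max_window_s:
                    break
                avg = (cum[j] - cum[i]) / dt          # mean acceleration in the window
                best = max(best, dt * avg ** 2.5)
        return best

    t = np.linspace(0.0, 0.1, 1001)                        # 100 ms trace
    a = 60.0 * np.sin(np.pi * t / 0.02) * (t <= 0.02)      # synthetic 20 ms, 60 g half-sine
    print(f"HIC36 = {hic(t, a):.0f}")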
