19 related references found.
1.
Exploring the significant variables related to specific types of crashes is vitally important in the planning stage of a transportation network. This paper aims to identify and examine the important variables associated with total crashes and severe crashes per traffic analysis zone (TAZ) in four counties of the state of Florida by applying nonparametric techniques such as data mining and random forests. The intention of investigating these factors at this aggregate level is to incorporate proactive safety measures into transportation planning. Total and severe crashes per TAZ were modeled to provide predictive decision trees. The variables carrying the highest importance weights for total crashes per TAZ were the total number of intersections per TAZ, airport trip productions, light truck productions, and total roadway segment length with a 35 mph posted speed limit. Other significant variables for total crashes were total roadway length with a 15 mph posted speed limit, total roadway length with a 65 mph posted speed limit, and non-home-based work productions. For severe crashes, the total number of intersections per TAZ, light truck productions, total roadway length with a 35 mph posted speed limit, and total roadway length with a 65 mph posted speed limit were among the significant variables. These variables were further verified and supported by the random forest results.
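For readers who want to reproduce this kind of zone-level screening, the sketch below ranks TAZ predictors by random forest importance. It is a minimal illustration, assuming hypothetical column names and a scikit-learn random forest rather than the authors' exact software and data.

```python
# Minimal sketch of zone-level variable-importance screening.
# Column names and file layout are assumptions, not the study's data.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Hypothetical TAZ-level table: one row per traffic analysis zone.
taz = pd.read_csv("taz_crashes.csv")  # assumed file layout
predictors = ["n_intersections", "airport_trip_prod", "light_truck_prod",
              "len_15mph", "len_35mph", "len_65mph", "nhb_work_prod"]
rf = RandomForestRegressor(n_estimators=500, random_state=0)
rf.fit(taz[predictors], taz["total_crashes"])

# Rank predictors by impurity-based importance.
for name, imp in sorted(zip(predictors, rf.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```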
2.
The objective of this study was to evaluate the impact of Winnipeg's photo enforcement safety program on speeding (i.e., "speed on green") and red-light running behavior at intersections, as well as on crashes resulting from these behaviors. ARIMA time series analyses of crashes related to red-light running (right-angle crashes and rear-end crashes) and crashes related to speeding (injury crashes and property-damage-only crashes) occurring at intersections were conducted using monthly crash counts from 1994 to 2008. A quasi-experimental intersection camera experiment was also conducted using roadside data on speeding and red-light running behavior at intersections; these data were analyzed using logistic regression. The time series analyses showed that for crashes related to red-light running, there was a 46% decrease in right-angle crashes at camera intersections, but also an initial 42% increase in rear-end crashes. For crashes related to speeding, the analyses revealed that the installation of cameras was not associated with increases or decreases in crashes. Results of the intersection camera experiment showed that there were significantly fewer red-light running violations at intersections after installation of cameras and that photo enforcement had a protective effect on speeding behavior at intersections. However, the data also suggest photo enforcement may be less effective in preventing serious speeding violations at intersections. Overall, Winnipeg's photo enforcement safety program had a positive net effect on traffic safety. Results from the ARIMA time series and the quasi-experimental design corroborate one another. However, the protective effect of photo enforcement is not equally pronounced across different conditions, so further monitoring is required to improve the delivery of this measure. Results from this study, as well as its limitations, are discussed.
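An interrupted time-series analysis of this kind can be sketched as follows, assuming a hypothetical monthly crash series and installation date; statsmodels' SARIMAX with a step-function regressor stands in for the authors' ARIMA intervention model.

```python
# Minimal interrupted time-series sketch: ARIMA with a step intervention.
# File name, install date, and model orders are illustrative assumptions.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

counts = pd.read_csv("monthly_right_angle_crashes.csv",
                     index_col=0, parse_dates=True).squeeze()  # assumed file
# Step regressor: 0 before camera installation, 1 after.
step = (counts.index >= "2003-01-01").astype(float)  # assumed install date

model = SARIMAX(counts, exog=step, order=(1, 0, 1),
                seasonal_order=(1, 0, 0, 12))
fit = model.fit(disp=False)
# A negative, significant coefficient on the step term indicates a
# post-installation drop in crashes, net of trend and seasonality.
print(fit.summary().tables[1])
```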
3.
Thomas H. Johnson Rebecca M. Medlin Laura Freeman 《Quality and Reliability Engineering International》2019,35(6):1666-1675
Reliability experiments determine which factors drive product reliability. Often, the reliability or lifetime data collected in these experiments tend to follow distinctly non‐normal distributions and typically include censored observations. The experimental design should accommodate the skewed nature of the response and allow for censored observations, which occur when products do not fail within the allotted test time. To account for these design and analysis considerations, Monte‐Carlo simulations are frequently used to evaluate experimental design properties. Simulation provides accurate power calculations as a function of sample size, allowing researchers to determine adequate sample sizes at each level of the treatment. However, simulation may be inefficient for comparing multiple experiments of various sizes. We present a closed‐form approach for calculating power, based on the noncentral chi‐squared approximation to the distribution of the likelihood ratio statistic for large samples. The solution can be used to rapidly compare multiple designs and accommodate trade‐space analyses between power, effect size, model formulation, sample size, censoring rates, and design type. To demonstrate the efficiency of our approach, we provide a comparison to estimates from simulation.
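The closed-form idea can be illustrated directly: under the large-sample approximation, the likelihood ratio statistic is noncentral chi-squared under the alternative, so power is one line of code. The noncentrality value below is illustrative; in a real design it would be computed from the assumed effect size, sample size, and censoring rate.

```python
# Power of a likelihood-ratio test via the noncentral chi-squared
# approximation. The noncentrality value is illustrative, not the paper's.
from scipy.stats import chi2, ncx2

def lr_power(noncentrality, df=1, alpha=0.05):
    """Power of a likelihood-ratio test with `df` constraints."""
    crit = chi2.ppf(1 - alpha, df)           # central chi-squared critical value
    return ncx2.sf(crit, df, noncentrality)  # P(reject | alternative)

# Example: noncentrality lambda = 7.85 gives roughly 80% power at df = 1.
print(f"power = {lr_power(7.85):.3f}")
```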
4.
Uncertainties and quantification of common cause failure rates and probabilities for system analyses
Jussi K. Vaurio 《Reliability Engineering & System Safety》2005,90(2-3):186-195
Simultaneous failures of multiple components due to common causes at random times are modelled by constant multiple-failure rates. A procedure is described for quantification of common cause failure (CCF) basic event probabilities for system models using plant-specific and multiple-plant failure-event data. Methodology is presented for estimating CCF-rates from event data contaminated with assessment uncertainties. Generalised impact vectors determine the moments for the rates of individual systems or plants. These moments determine the effective numbers of events and observation times to be input to a Bayesian formalism to obtain plant-specific posterior CCF-rates. The rates are used to determine plant-specific common cause event probabilities for the basic events of explicit fault tree models depending on test intervals, test schedules and repair policies. Three methods are presented to determine these probabilities such that the correct time-average system unavailability can be obtained with single fault tree quantification. Recommended numerical values are given and examples illustrate different aspects of the methodology.
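The Bayesian step can be sketched with a standard gamma-Poisson update: the effective event count and observation time derived from the impact vectors play the role of Poisson data. The prior and the effective numbers below are illustrative assumptions, not values from the paper.

```python
# Conjugate gamma-Poisson update for a CCF rate, as a minimal sketch.
# Prior parameters and effective data are illustrative assumptions.
from scipy.stats import gamma

a0, b0 = 0.5, 100.0        # assumed gamma prior: shape, rate (per hour)
n_eff, t_eff = 1.6, 8.0e4  # effective events and hours from impact vectors

a_post, b_post = a0 + n_eff, b0 + t_eff
post = gamma(a_post, scale=1.0 / b_post)
print(f"posterior mean rate = {post.mean():.3e} /h")
print(f"90% interval = ({post.ppf(0.05):.3e}, {post.ppf(0.95):.3e}) /h")
```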
5.
Aram Soroushian Peter Wriggers Jamshid Farjoodi 《International journal for numerical methods in engineering》2009,80(5):565-595
The results produced by Richardson extrapolation, though generally very accurate, are inexact. Numerical evaluation of this inexactness and implementation of the evaluation in practice are the objectives of this paper. First, considering linear changes of errors in the convergence plots, asymptotic upper bounds are proposed for the errors. Then, the achievement is extended to the results produced by Richardson extrapolation, and finally, an error‐controlling procedure is proposed and successfully implemented in approximate computations originating in science and engineering. Copyright © 2009 John Wiley & Sons, Ltd.
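As a minimal illustration of Richardson extrapolation and the kind of convergence information the error bounds are built from, consider a second-order finite difference; the paper's upper-bound construction is more elaborate than this sketch.

```python
# Richardson extrapolation with a simple a-posteriori error estimate,
# using numerical differentiation as a stand-in problem.
import numpy as np

def f(x):
    return np.sin(x)

def deriv(h, x=1.0):
    return (f(x + h) - f(x - h)) / (2 * h)  # O(h^2) central difference

h = 0.1
d1, d2 = deriv(h), deriv(h / 2)
# Richardson extrapolation for a second-order method (p = 2):
extrap = d2 + (d2 - d1) / (2**2 - 1)
# (d2 - d1)/3 also serves as a rough error estimate for d2: the kind of
# convergence-plot information the paper turns into an asymptotic bound.
print(f"extrapolated = {extrap:.10f}, exact = {np.cos(1.0):.10f}")
print(f"error estimate for d2 = {abs(d2 - d1) / 3:.2e}")
```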
6.
Pharmaceutical quality systems use various inputs to ensure product quality and prevent failures that might have patient consequences. These inputs are generally data from failures that have already occurred, for example process deviations or customer complaints. Risk analysis techniques are well established in certain other industries and have become of interest to pharmaceutical manufacturers because they allow potential quality failures to be predicted and mitigating action to be taken before they occur. Failure mode and effects analysis (FMEA) is one such technique, and in this study it was applied to the implementation of a computerized manufacturing execution system in a pharmaceutical manufacturing environment. After introduction, the system was monitored to detect failures that did occur, and these were analyzed to determine why the risk analysis method had failed to predict them. Application of FMEA in other industries has identified weaknesses in predicting certain error types, specifically its dependence on other techniques to model risk situations and its poor analysis of non-hardware risks such as human error, and this was confirmed in this study. Hierarchical holographic modeling (HHM), a technique for identifying risk scenarios in wide-scope analyses, was applied subsequently and identified additional potential failure modes. The technique for human error rate prediction (THERP) has previously been used for the quantitative analysis of human error risk; the event tree from this technique was adapted and identified further human error scenarios. These were input to the FMEA for prioritization and mitigation, thereby strengthening the risk analysis in terms of the failure modes considered.
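The FMEA prioritization step can be sketched as follows; the failure modes and ratings are illustrative assumptions, and the risk priority number (RPN) is the conventional severity x occurrence x detection product.

```python
# Minimal FMEA prioritization sketch: RPN = severity * occurrence * detection.
# Failure modes and ratings are illustrative assumptions.
failure_modes = [
    # (description, severity 1-10, occurrence 1-10, detection 1-10)
    ("operator enters wrong batch ID", 8, 4, 6),
    ("interface drops shop-floor message", 7, 3, 4),
    ("recipe parameter mistyped", 9, 2, 7),
]
for desc, s, o, d in sorted(failure_modes,
                            key=lambda m: -(m[1] * m[2] * m[3])):
    print(f"RPN {s * o * d:4d}  {desc}")
```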
7.
Automated manufacturability evaluation of a given design is a key requirement in realizing complete integration of design and process planning. Better still is to steer the design process itself with manufacturability information. The purpose of such an evaluation is to assist designers in their effort to come up with manufacturable parts that economize on cost and time without compromising quality and functional requirements. The present work deals with one such system developed for manufacturability evaluation of sheet metal components. Unlike most past work in the sheet metal area, which concentrates on a specific domain or phase of manufacturability evaluation, the present work is more comprehensive, combining characteristics of all the existing methods and phases of manufacturability evaluation. The prime components of the present system are design evaluation and process plan generation. The design evaluator and process planner use different types of data and knowledge to identify design and process planning violations, which are overcome by suggesting design changes. The process planner also uses manufacturing resource and process information to arrive at manufacturable parts by generating feasible process plans. Results of manufacturability evaluation are presented for typical sheet metal parts to be produced by bending and shearing (blanking and piercing) processes.
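As a flavor of what such a design evaluator checks, the sketch below applies two common sheet-metal handbook rules (hole-to-edge and hole-to-bend clearances in multiples of the sheet thickness t); the thresholds are assumed for illustration and are not taken from the paper.

```python
# Minimal design-rule check of the kind a sheet-metal design evaluator
# might apply. Thresholds are assumed handbook-style values.
def check_hole(dist_to_edge, dist_to_bend, thickness):
    violations = []
    if dist_to_edge < 1.5 * thickness:
        violations.append("hole-to-edge distance below 1.5t")
    if dist_to_bend < 2.5 * thickness:
        violations.append("hole-to-bend distance below 2.5t")
    return violations or ["OK"]

# Example: 2 mm sheet, hole 2.5 mm from an edge and 6 mm from a bend line.
print(check_hole(dist_to_edge=2.5, dist_to_bend=6.0, thickness=2.0))
```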
8.
J. J. Ródenas G. Bugeda J. Albelda E. Oñate 《International journal for numerical methods in engineering》2011,87(11):1105-1126
This work analyzes how the discretization error associated with the finite element (FE) analyses of each design configuration proposed by a structural shape optimization algorithm influences the behavior of the algorithm. The paper clearly shows that if the FE analyses are not accurate enough, the final solution provided by the optimization algorithm will neither be optimal nor satisfy the constraints. The need for adaptive FE analysis techniques in shape optimum design is demonstrated. The paper proposes the combination of two strategies to reduce the computational cost related to the use of mesh adaptivity in evolutionary optimization algorithms: (a) the use of an algorithm for mesh generation by projection of the discretization error, which reduces the computational cost associated with the adaptive FE analysis of each geometrical configuration, and (b) the successive increase of the required accuracy of the FE analyses, which yields a considerable reduction of the computational cost in the early stages of the optimization process. Copyright © 2011 John Wiley & Sons, Ltd.
9.
Evaluation literature suggests that assessments of integrated transport plans should be an inclusive dialogue, for which it is crucial that participants communicate with and trust each other. However, cost benefit analysis (CBA) of integrated transport plans is often characterized by communication deficits and distrust between plan owners and evaluators. A literature review suggested five communication and trust-building interventions, and related mechanisms, that might improve this. In this paper, we tested the efficacy of these five interventions by applying them in an experiential study with two sequential cases representing close-to-real situations. The research aimed to develop field-tested knowledge to address the aforementioned class of CBA process problems. It demonstrated how the five interventions could facilitate an exchange of information, knowledge and experiences which, according to the participants, would increase the effectiveness of the CBA. Furthermore, it illustrated that a communication and trust-building strategy such as the one tested can be a useful complement to CBA practices if adapted to the characteristics of the specific assessment process and planning context.
10.
Accident analysis involves the use of both quantitative and qualitative data in decision-making. The aim of this paper is to demonstrate the synthesis of relevant quantitative and qualitative evidence for accident analysis and for planning a large and diverse portfolio of highway investment projects. The proposed analysis and visualization techniques along with traditional mathematical modeling serve as an aid to planners, engineers, and the public in comparing the benefits of current and proposed improvement projects. The analysis uses data on crash rates, average daily traffic, cost estimates from highway agency databases, and project portfolios for regions and localities. It also utilizes up to two motivations out of seven that are outlined in the Transportation Equity Act for the 21st Century (TEA-21). Three case studies demonstrate the risk-based approach to accident analysis for short- and long-range transportation plans. The approach is adaptable to other topics in accident analysis and prevention that involve the use of quantitative and qualitative evidence, risk analysis, and multi-criteria decision-making for project portfolio selection.
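A weighted multi-criteria scoring of candidate projects, one ingredient of such portfolio selection, can be sketched as below; the projects, criteria, weights, and normalized scores are illustrative assumptions, not the paper's data.

```python
# Minimal multi-criteria scoring sketch for project portfolio ranking.
# All names, weights, and normalized scores are illustrative assumptions.
projects = {
    "widen rural segment":   {"crash_rate": 0.9, "adt": 0.4, "cost": 0.7},
    "intersection redesign": {"crash_rate": 0.7, "adt": 0.8, "cost": 0.5},
    "shoulder improvement":  {"crash_rate": 0.5, "adt": 0.3, "cost": 0.9},
}
weights = {"crash_rate": 0.5, "adt": 0.3, "cost": 0.2}  # assumed priorities

scores = {name: sum(weights[c] * v for c, v in crit.items())
          for name, crit in projects.items()}
for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{s:.2f}  {name}")
```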
11.
Y. B. Park 《International Journal of Production Research》2013,51(6):1205-1224
Many firms have been trying to optimize their production and distribution systems separately, but using this approach limits any possible increase in profit. Thus, it is becoming more important to analyse these two systems simultaneously. This paper presents the solutions for integrated production and distribution planning and investigates the effectiveness of their integration through a computational study, in a multi-plant, multi-retailer, multi-item, and multi-period logistic environment where the objective is to maximize the total net profit. Computational results on test problems using the proposed heuristic confirm the substantial advantage of the integrated planning approach over the decoupled one. Sensitivity analysis on the input parameters indicates that, under the right conditions, the effectiveness of integrating production and distribution functions can be extremely high.
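A toy version of the integrated problem can be written as a small linear program, sketched below with PuLP; the single-item, single-period instance and all numbers are illustrative assumptions, whereas the paper solves a much larger multi-period problem with a heuristic.

```python
# Toy integrated production-distribution LP: two plants, two retailers,
# one item, one period, joint profit maximization. All data are assumed.
from pulp import LpProblem, LpVariable, LpMaximize, lpSum, value

plants, retailers = ["P1", "P2"], ["R1", "R2"]
cap = {"P1": 120, "P2": 80}                 # production capacities
demand = {"R1": 90, "R2": 70}
prod_cost = {"P1": 4.0, "P2": 5.0}
ship_cost = {("P1", "R1"): 1.0, ("P1", "R2"): 2.5,
             ("P2", "R1"): 2.0, ("P2", "R2"): 1.0}
price = 10.0

m = LpProblem("prod_dist", LpMaximize)
x = {(p, r): LpVariable(f"x_{p}_{r}", lowBound=0)
     for p in plants for r in retailers}
# Net profit: revenue minus production and shipping costs, decided jointly.
m += lpSum((price - prod_cost[p] - ship_cost[p, r]) * x[p, r]
           for p in plants for r in retailers)
for p in plants:
    m += lpSum(x[p, r] for r in retailers) <= cap[p]
for r in retailers:
    m += lpSum(x[p, r] for p in plants) <= demand[r]
m.solve()
print("net profit =", value(m.objective))
```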
12.
A direct energy balance approach has been developed and used to determine energy release rates in three and four point bend end notched flexure tests. This study was performed in the context of the larger goal of understanding the wide variation in mode II toughnesses that have been obtained by the two tests when used on the same material. The primary motivation for developing the direct energy balance approach was to fully account for the effects of friction, large deformations, and other geometric nonlinearities that occur during these tests. The direct energy balance approach simulates crack advance as it occurs in physical testing. Most significantly, this approach accounts for frictional dissipation that occurs during crack advance, which is an effect that has been neglected in previous analyses of these tests. The direct energy balance approach is used to show that, for most cases of practical interest, the virtual crack closure technique is quite accurate, and predictions by this latter approach are only in error when moderately large geometric nonlinearities occur prior to crack advance. Based on these results, a "cut-off value," expressed in terms of the maximum slope in the specimen as predicted by classical beam theory, is suggested for the upper limit of applicability of the virtual crack closure technique.
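For reference, the virtual crack closure technique that the energy-balance results are benchmarked against reduces, in its one-step form, to a simple product of the crack-tip nodal force and the relative displacement behind the tip; the sketch below uses illustrative input values, not the paper's specimens.

```python
# One-step VCCT evaluation of the mode II energy release rate.
# Input values are illustrative assumptions.
def vcct_mode2(F_shear, delta_u, b, da):
    """G_II = F * du / (2 * b * da): tip nodal shear force F, sliding
    displacement du behind the tip, width b, element length da."""
    return F_shear * delta_u / (2.0 * b * da)

# Example: 150 N tip force, 2e-5 m sliding displacement, 25 mm width,
# 0.5 mm element length ahead of the tip.
print(f"G_II = {vcct_mode2(150.0, 2e-5, 0.025, 0.0005):.1f} J/m^2")
```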
13.
To explore constraint effects on fully plastic crack-tip fields, analytical solutions are examined for mode-I, II and III loading in non-hardening materials under plane strain conditions. The results reveal that under mode-II and III loading the crack-tip stress fields are unique, and thus can be characterized by a 'single parameter'. Under mode-I loading, however, the crack-tip stress field is non-unique but can be characterized by two sets of solutions, or 'two parameters'. One set of the solutions is the well-known Prandtl field and the other is a plastic T-stress field. This conclusion corroborates the observation of McClintock (1971) that the slip-line field is non-unique for plane strain tensile cracks. A two-term plastic solution which combines the Prandtl field and the plastic T-stress field with two parameters B1 and B2 can then characterize the crack-tip stress field of a plane strain mode-I crack over the plastic region and quantify the magnitude of crack-tip constraint. These characteristics are similar to those for hardening materials. Analyses and examples show that the two-term plastic solution matches well with the slip-line field or finite element results over the plastic region. Thus the parameters B1 and B2 can be used to characterize the constraint level for mode-I finite-sized crack specimens in non-hardening materials under plane strain conditions.
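Schematically, and under the assumption that the two terms combine additively as described, the characterization can be written as:

```latex
% Schematic two-term form, assuming additive combination of the Prandtl
% field and the plastic T-stress field; the paper defines both terms.
\sigma_{ij} \;=\; B_1\,\sigma_{ij}^{\mathrm{Prandtl}} \;+\; B_2\,\sigma_{ij}^{T}
```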
14.
Xiaoqing Chen Xinwang Liu Yong Qin 《Quality and Reliability Engineering International》2021,37(1):284-308
Human error is one of the largest contributing factors to unsafe operation and accidents in high-speed train operation. As a well-known second-generation human reliability analysis (HRA) technique, the cognitive reliability and error analysis method (CREAM) has been introduced to address HRA problems in various fields. Nevertheless, current CREAM models cannot simultaneously consider the interdependencies between the common performance conditions (CPCs) and determine the weights of these CPCs. Hence, the purpose of this paper is to develop a hybrid HRA model integrating CREAM, interval type-2 fuzzy sets, and the analytic network process (ANP) to overcome this drawback. Firstly, interval type-2 fuzzy sets are utilized to express the highly uncertain information of the CPCs. Secondly, the ANP is incorporated into CREAM to depict the interdependencies between the CPCs and determine their weights. The human error probability (HEP) is then calculated from the obtained weights. Finally, an illustrative example of an HRA problem in high-speed train operation demonstrates the application and validity of the proposed model. The results indicate that experts prefer to express their preferences with fuzzy sets rather than crisp values, and that the interdependencies between the CPCs are better depicted in the proposed model.
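One way to sketch the weighting step: ANP-derived CPC weights aggregate the CPC ratings into a context score that adjusts a nominal error probability. The log-linear mapping and all numbers below are illustrative assumptions, not the paper's interval type-2 fuzzy model.

```python
# Minimal sketch of weighted CPC aggregation adjusting a nominal HEP.
# The log-linear mapping, equal weights, and ratings are assumptions.
cpc_effect = {  # assumed ratings: -1 degrades, 0 neutral, +1 improves
    "adequacy of organisation": -1, "working conditions": -1,
    "adequacy of MMI and operational support": -1,
    "availability of procedures/plans": 0,
    "number of simultaneous goals": 0, "available time": 1,
    "time of day": 0, "adequacy of training and experience": 1,
    "crew collaboration quality": 0,
}
anp_weight = {k: 1 / len(cpc_effect) for k in cpc_effect}  # assumed equal

beta = sum(anp_weight[k] * v for k, v in cpc_effect.items())
hep0 = 1e-2                 # assumed nominal error probability
hep = hep0 * 10 ** (-beta)  # degrading context (beta < 0) raises HEP
print(f"context score = {beta:+.3f}, HEP = {hep:.2e}")
```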
15.
Over the past 30 years, thermal analysis of organic peroxides has become an important issue in chemical engineering departments, safety departments, and companies working with polymerization, petrochemical processes, and so on. The contributions of thermal analysis to the evaluation and prediction of runaway reactions have been important for reducing or preventing hazards such as fire or explosion accidents. This study used differential scanning calorimetry (DSC) to evaluate kinetic and safety parameters under isothermal and non-isothermal conditions, such as the temperature of no return (TNR), self-accelerating decomposition temperature (SADT), time to maximum rate (TMR), activation energy (Ea), frequency factor (A), reaction order (n), and reaction heat (ΔH), for the hazardous material 1,1-di-(tert-butylperoxy)-3,3,5-trimethylcyclohexane (TMCH) at 88 mass%. On the basis of this study, we demonstrated that TMCH 88 mass% must be well controlled in the manufacturing process because of its unstable O-O bond, which releases a large quantity of heat, more than 1300 J/g, on decomposition. The results could help plants that adopt TMCH 88 mass% in a process to prevent fire or explosion accidents.
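One of the screening quantities named above can be illustrated concretely: for a zero-order reaction, the adiabatic time to maximum rate is TMR_ad = cp·R·T0²/(q0·Ea), with the heat rate q0 scaled from a DSC reference point by the Arrhenius law. All inputs below are illustrative assumptions, not the measured TMCH data.

```python
# Adiabatic time to maximum rate (zero-order approximation).
# All material parameters are illustrative assumptions.
import math

R = 8.314                  # J/(mol K)
Ea = 1.3e5                 # assumed activation energy, J/mol
cp = 2000.0                # assumed specific heat, J/(kg K)
q_ref, T_ref = 5.0, 400.0  # assumed heat rate (W/kg) at reference temp (K)

def tmr_ad(T0):
    # Scale the heat generation rate to T0 via the Arrhenius law.
    q0 = q_ref * math.exp(-Ea / R * (1 / T0 - 1 / T_ref))
    return cp * R * T0**2 / (q0 * Ea) / 3600.0  # hours

for T0 in (330.0, 350.0, 370.0):
    print(f"T0 = {T0:.0f} K: TMR_ad = {tmr_ad(T0):.1f} h")
```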
16.
《Mauerwerk》2017,21(4):209-222
Since masonry is one of the oldest and most traditional construction types, the corresponding safety concepts are usually based on experience instead of being calibrated by structural reliability methods. For this reason, reliability analyses of masonry structures are needed to check whether safety factors should be adjusted. Masonry is a non-homogeneous material, so it is very important to consider the spatial variability of material properties when assessing the reliability of masonry walls. It is therefore useful to know whether, and to what extent, spatial variability increases or decreases the reliability of masonry walls and the required safety factors. The influence of spatial variability depends on the length of a wall, due to the capability of load redistribution. It is also affected by the governing failure mode, which depends on the slenderness of the wall and can be local compression or stability failure. This paper demonstrates the effect of spatial variability on the load-bearing capacity of masonry walls in terms of mean value, scatter and design value. For this purpose, walls of varying length and slenderness were analysed with and without consideration of spatial variability by performing Monte Carlo simulations. Based upon that, the safety factors required to meet the target reliability defined by EN 1990 were determined.
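The Monte Carlo reliability estimate at the core of such a study can be sketched in a few lines: sample resistance and load, count failures, convert to a reliability index. The distributions and parameters below are illustrative assumptions; the paper additionally models spatially correlated random fields along the wall.

```python
# Minimal Monte Carlo reliability sketch: P_f = P(resistance < load).
# Distributions and parameters are illustrative assumptions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 1_000_000
R = rng.lognormal(mean=np.log(300.0), sigma=0.17, size=n)  # resistance, kN/m
E = rng.gumbel(loc=150.0, scale=15.0, size=n)              # load effect, kN/m

pf = np.mean(R < E)
beta = -norm.ppf(pf) if pf > 0 else float("inf")
print(f"P_f = {pf:.2e}, reliability index beta = {beta:.2f}")
```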
17.
Application of the partial safety concept for complex non‐linear calculations through the example of Friedrichswerder Church in Berlin
Dr.‐Ing. habil. Tammam Bakeer 《Mauerwerk》2018,22(1):3-14
Non-linear analysis of special structures is routine practice in every engineering design office, yet current code provisions give no sufficient regulations or clear procedures for applying partial safety factors in non-linear analyses. The present paper introduces some specific problems of masonry structures from practice. Limit state functions of vertically loaded masonry walls were investigated. A reliability study shows that the decisive verification case can be obtained by considering a special combination of material parameters. The study also indicates the importance of a partial factor for the elastic modulus to account for stability failure. The proposed procedure was applied to a practical example, the Friedrichswerder Church, whose stability was checked with many FEM models at the macro level.
18.
Galina Zilberstein 《Materials Characterization》1997,39(2-5):687-695
A method of tungsten grain size analysis in coiled lamp filaments has been developed. It is shown to be a valuable tool in the design and optimization of heat treatment schedules, prediction of sag rates, and failure analyses, as well as in standardization and benchmarking of non-sag tungsten grain structure in coiled lamp filaments. The procedure consists of two measurements on a polished and etched cross-section of a lamp coil: the number of observed "grains" and the number of examined turns. The resulting figure of merit, a grain-per-turn ratio, provides a useful relative measure of grain size (grain aspect ratio) in recrystallized non-sag tungsten lamp coils. The method permits rapid measurement, does not require any special instrumentation, and is performed directly on the stage of a standard light microscope.
19.
Zhe Li Fei Wu LiJie Zhao Xiao Lin Lan Shen Yi Feng 《Advanced Powder Technology》2018,29(11):2881-2894
Direct compaction (DC) is the preferred choice for tablet manufacturing; however, its application to natural plant product (NPP) tablets is still extremely immature. In this study, NPP powders prepared by three commonly used methods were evaluated for their suitability for DC. Extensive characterization of their physical properties was performed. Multivariate statistical analysis was utilized to explore the influence of preparation technology on the properties of NPP powders and to identify the dominant factors that influence their DC properties. The results demonstrated that (i) the 27 model NPP powders selected randomly in this study could to some degree represent most NPP powders used in actual production; (ii) ~81.5% of the NPP powders exhibited both poor compactibility and poor flowability, and none could be compacted into tablets via DC; (iii) the physical properties of NPP powders prepared by direct pulverization were significantly different from those of extracted ones, while there were no significant differences between the water-extracted and ethyl alcohol-extracted ones; and (iv) the DC properties of NPP powders could be improved by reasonably controlling some physical properties (e.g., density, particle size, morphology, and texture parameters). Overall, this study comprehensively evaluated the current status and application of NPP powders in DC and is significant in facilitating the development and modernization of NPPs through DC.
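The multivariate screening step can be sketched with a principal component analysis of a powder-property matrix; the properties and values below are illustrative assumptions, not the study's 27-powder dataset.

```python
# Minimal PCA sketch of a powder-property matrix.
# Property names and values are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

properties = ["bulk_density", "particle_size_d50", "flowability_index",
              "compactibility"]
X = np.array([[0.45, 80.0, 12.0, 1.2],
              [0.32, 25.0, 28.0, 0.6],
              [0.51, 150.0, 9.0, 1.8],
              [0.38, 40.0, 22.0, 0.9]])  # assumed powders x properties

pca = PCA(n_components=2).fit(StandardScaler().fit_transform(X))
print("explained variance ratios:", pca.explained_variance_ratio_.round(2))
for pc, loadings in enumerate(pca.components_, start=1):
    top = properties[int(np.argmax(np.abs(loadings)))]
    print(f"PC{pc} dominated by: {top}")
```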