20 similar documents retrieved (search time: 15 ms)
1.
Poisson and negative binomial (NB) models have been used to analyze traffic accident occurrence at intersections for several years. There are, however, limitations in the use of such models. The Poisson model requires the variance-to-mean ratio of the accident data to be about 1, and both the Poisson and the NB models require the accident data to be uncorrelated in time. Owing to unobserved heterogeneity and serial correlation in the accident data, both models seem to be inappropriate. A more suitable alternative is the random effect negative binomial (RENB) model, which, by treating the data as a time-series cross-section panel, can deal with the spatial and temporal effects in the data. This paper describes the use of the RENB model to identify the elements that affect intersection safety. To establish the suitability of the model, several goodness-of-fit statistics are used. The model is then applied to investigate the relationship between accident occurrence and the geometric, traffic and control characteristics of signalized intersections in Singapore. The results showed that 11 variables significantly affected safety at the intersections. The total approach volume, the number of phases per cycle, the presence of an uncontrolled left-turn lane and the presence of a surveillance camera are among the most highly significant variables.
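A minimal Python sketch of the overdispersion issue discussed above: check the variance-to-mean ratio and compare Poisson and negative binomial fits on simulated intersection counts. This is not the paper's RENB panel specification; the covariates (approach_volume, num_phases) and all coefficients are hypothetical.

# Hypothetical illustration: overdispersion check and Poisson vs. NB comparison
# (not the RENB panel model used in the paper).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "approach_volume": rng.uniform(5_000, 60_000, n),  # veh/day, hypothetical
    "num_phases": rng.integers(2, 5, n),
})
mu = np.exp(-2.0 + 5e-5 * df["approach_volume"] + 0.15 * df["num_phases"])
r = 2.0                                                # NB dispersion
df["accidents"] = rng.negative_binomial(r, r / (r + mu))

# Poisson would require this ratio to be about 1; here it is well above 1.
print("variance-to-mean ratio:", df["accidents"].var() / df["accidents"].mean())

X = sm.add_constant(df[["approach_volume", "num_phases"]])
poisson_fit = sm.Poisson(df["accidents"], X).fit(disp=False)
nb_fit = sm.NegativeBinomial(df["accidents"], X).fit(disp=False)
print("Poisson AIC:", poisson_fit.aic, " NB AIC:", nb_fit.aic)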
2.
In this paper, two-state Markov switching models are proposed to study accident frequencies. These models assume that there are two unobserved states of roadway safety, and that roadway entities (roadway segments) can switch between these states over time. The states are distinct in the sense that accident frequencies in the different states are generated by separate counting processes (separate Poisson or negative binomial processes). To demonstrate the applicability of the approach, two-state Markov switching negative binomial models are estimated using five-year accident frequencies on Indiana interstate highway segments. Bayesian inference methods and Markov chain Monte Carlo (MCMC) simulations are used for model estimation. The estimated Markov switching models result in a superior statistical fit relative to the standard (single-state) negative binomial model. The more frequent state is found to be safer and correlated with better weather conditions; the less frequent state is found to be less safe and correlated with adverse weather conditions.
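A small Python simulation sketch of the two-state idea: a hidden Markov chain switches between a frequent "safer" state and a less frequent "less safe" state, each generating counts from its own negative binomial process. The transition probabilities and state-specific means are hypothetical, and the Bayesian MCMC estimation used in the paper is not reproduced.

# Hypothetical two-state Markov switching count simulation (illustration only).
import numpy as np

rng = np.random.default_rng(1)
P = np.array([[0.95, 0.05],    # state 0: frequent, "safer"
              [0.30, 0.70]])   # state 1: less frequent, "less safe"
mu = np.array([1.0, 4.0])      # state-specific mean accident frequency
r = 2.0                        # shared NB dispersion parameter

n_periods, state = 260, 0
states, counts = [], []
for _ in range(n_periods):
    state = rng.choice(2, p=P[state])
    states.append(state)
    counts.append(rng.negative_binomial(r, r / (r + mu[state])))

states, counts = np.array(states), np.array(counts)
print("share of periods in state 0:", (states == 0).mean())
print("mean count | state 0:", counts[states == 0].mean(),
      "| state 1:", counts[states == 1].mean())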
3.
Angelo Efoévi Koudou, TEST, 1998, 7(1): 95-110
The so-called Lancaster probabilities on R^2 are a class of distributions satisfying an orthogonality condition involving orthogonal polynomials with respect to their marginal laws. They are characterized in the cases where the two identical margins are Gaussian, gamma (the latter are known results, but a new treatment is given), Poisson or negative binomial distributions. Some partial results are obtained in the cases of two different Poisson or negative binomial margins, and also in the case where one margin is gamma and the other margin is negative binomial.
4.
Univariate integer-valued time series models, including integer-valued autoregressive (INAR) models and integer-valued generalized autoregressive conditional heteroscedastic (INGARCH) models, have been well studied in the literature, but progress on multivariate models has been limited. Although some multivariate INAR models have been proposed, they do not provide enough flexibility in modeling count data, such as the volatility of the number of stock transactions. A bivariate Poisson INGARCH model was suggested by Liu (Some models for time series of counts, Dissertation, Columbia University, 2012), but it can only deal with positive cross-correlation between the two components. To remedy this defect, we propose a new bivariate Poisson INGARCH model, which is more flexible and allows for positive or negative cross-correlation. Stationarity and ergodicity of the new process are established. The maximum likelihood method is used to estimate the unknown parameters, and consistency and asymptotic normality of the estimators are given. A simulation study is conducted to evaluate the estimators of the parameters of interest. Real and artificial data examples are presented to demonstrate the good performance of the proposed model relative to the existing model.
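For readers unfamiliar with the INGARCH class, the following Python sketch simulates the univariate Poisson INGARCH(1,1) building block, in which the conditional mean follows the recursion lambda_t = omega + alpha*y_{t-1} + beta*lambda_{t-1}. It does not reproduce the paper's bivariate construction with negative cross-correlation; parameter values are hypothetical.

# Hypothetical univariate Poisson INGARCH(1,1) simulation (illustration only).
import numpy as np

rng = np.random.default_rng(2)
omega, alpha, beta = 0.5, 0.3, 0.5      # alpha + beta < 1 ensures stationarity
T = 500
lam = np.empty(T)
y = np.empty(T, dtype=int)
lam[0] = omega / (1 - alpha - beta)     # start at the stationary mean
y[0] = rng.poisson(lam[0])
for t in range(1, T):
    lam[t] = omega + alpha * y[t - 1] + beta * lam[t - 1]   # conditional mean recursion
    y[t] = rng.poisson(lam[t])

print("sample mean:", y.mean(), "stationary mean:", omega / (1 - alpha - beta))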
5.
In this paper, we propose a Bayesian method for modelling count data by Poisson, binomial or negative binomial distributions. These three distributions have in common that the variance is, at most, a quadratic function of the mean. We use prior distributions on the variance function coefficients to consider the three possible models simultaneously and decide which one fits the data best. This approach sheds new light on the analysis of the Sibship data (Sokal and Rohlf, 1987). The Jeffreys-Lindley paradox is discussed through some illustrations.
6.
The usual assumption of a Poisson model for the number of chromosome aberrations in controlled calibration experiments implies variance equal to the mean. However, it is known that chromosome aberration data from experiments involving high linear energy transfer (LET) radiations can be overdispersed, i.e. the variance is greater than the mean. Present methods for dealing with overdispersed chromosome data rely on frequentist statistical techniques. In this paper, the problem of overdispersion is considered from a Bayesian standpoint. The Bayes factor is used to compare Poisson and negative binomial models for two previously published calibration data sets describing the induction of dicentric chromosome aberrations by high doses of neutrons. Posterior densities for the model parameters, which characterise dose response and overdispersion, are calculated and graphed. Calibrative densities are derived for unknown neutron doses from hypothetical radiation accident data to determine the impact of different model assumptions on dose estimates. The main conclusion is that an initial assumption of a negative binomial model is the conservative approach to chromosome dosimetry for high LET radiations.
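As a rough Python illustration of comparing the two count models, the sketch below uses the BIC approximation to the Bayes factor for a single simulated sample of aberration counts. The paper's full Bayesian dose-response treatment and its priors are not reproduced; the data are hypothetical.

# Hypothetical BIC-based approximation to the Poisson vs. NB Bayes factor.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(3)
y = rng.negative_binomial(3, 0.3, size=80)     # simulated overdispersed counts
n = len(y)

# Poisson: one free parameter (the mean)
ll_pois = stats.poisson.logpmf(y, y.mean()).sum()
bic_pois = -2 * ll_pois + 1 * np.log(n)

# Negative binomial: two free parameters (r, p), fitted by maximum likelihood
neg_ll = lambda params: -stats.nbinom.logpmf(y, params[0], params[1]).sum()
res = optimize.minimize(neg_ll, x0=[1.0, 0.5],
                        bounds=[(1e-6, None), (1e-6, 1 - 1e-6)])
bic_nb = 2 * res.fun + 2 * np.log(n)

# exp((BIC_Poisson - BIC_NB) / 2) approximates the Bayes factor of NB over Poisson
print("approximate Bayes factor (NB over Poisson):", np.exp((bic_pois - bic_nb) / 2))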
7.
Chunjiao Dong, David B. Clarke, Xuedong Yan, Asad Khattak, Baoshan Huang, Accident Analysis & Prevention, 2014
Crash data are collected through police reports and integrated with road inventory data for further analysis. Integrated police reports and inventory data yield correlated multivariate data for roadway entities (e.g., segments or intersections). Analysis of such data reveals important relationships that can help focus attention on high-risk situations and inform safety countermeasures. To understand the relationships between crash frequencies and associated variables while taking full advantage of the available data, multivariate random-parameters models are appropriate, since they can simultaneously consider the correlation among specific crash types and account for unobserved heterogeneity. However, a key issue that arises with correlated multivariate data is that the number of crash-free samples increases as crash counts are split into many categories. In this paper, we describe a multivariate random-parameters zero-inflated negative binomial (MRZINB) regression model for jointly modeling crash counts. The full Bayesian method is employed to estimate the model parameters. Crash frequencies at urban signalized intersections in Tennessee are analyzed. The paper investigates the performance of the multivariate zero-inflated negative binomial (MZINB) and MRZINB regression models in establishing the relationship between crash frequencies, pavement conditions, traffic factors, and geometric design features of roadway intersections. Compared to the MZINB model, the MRZINB model identifies additional statistically significant factors and provides better goodness of fit in developing the relationships. The empirical results show that the MRZINB model possesses most of the desirable statistical properties in terms of its ability to accommodate unobserved heterogeneity and excess zero counts in correlated data. Notably, in the random-parameters (MRZINB) model, the estimated parameters vary significantly across intersections for different crash types.
8.
9.
The aim of many occupational safety interventions is to reduce the incidence of injury. However, when measuring intervention effectiveness within a period, population-based accident count data typically contain a large proportion of zero observations (no injury). This situation is compounded where injuries are categorized in a binary manner according to an outcome of interest. The distribution thus comprises a point mass at zero mixed with a non-degenerate parametric component, such as the bivariate Poisson. In this paper, a bivariate zero-inflated Poisson (BZIP) regression model is proposed to evaluate a participatory ergonomics team intervention conducted within the cleaning services department of a public teaching hospital. The findings highlight that the BZIP distribution provided a satisfactory fit to the data, and that the intervention was associated with a significant reduction in overall injury incidence and the mean number of musculoskeletal (MLTI) injuries, while the decline in injuries of a non-musculoskeletal (NMLTI) nature was marginal. In general, the method can be applied to assess the effectiveness of intervention trials on other populations at high risk of occupational injury.
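A minimal Python sketch of the zero-inflation idea, using a univariate zero-inflated Poisson regression with a pre/post intervention indicator; the bivariate ZIP (BZIP) model of the paper is not reproduced. The data, the 40% structural-zero share and the intervention effect are all simulated.

# Hypothetical univariate ZIP regression (the paper uses a bivariate ZIP model).
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(4)
n = 300
post = rng.integers(0, 2, n)                    # 1 = observation after the intervention
mu = np.exp(0.8 - 0.5 * post)                   # intervention lowers the Poisson mean
y = np.where(rng.random(n) < 0.4, 0, rng.poisson(mu))   # 40% structural zeros

X = sm.add_constant(post)
fit = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1))).fit(disp=False)
print(fit.params)   # inflation intercept plus Poisson-part intercept and 'post' effect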
10.
Crash data can often be characterized by over-dispersion, a heavy (long) tail and many observations with the value zero. Over the last few years, a small number of researchers have started developing and applying novel and innovative multi-parameter models to analyze such data. These multi-parameter models have been proposed to overcome the limitations of the traditional negative binomial (NB) model, which cannot handle this kind of data efficiently. The research documented in this paper continues the work related to multi-parameter models. The objective of this paper is to document the development and application of a flexible NB generalized linear model with randomly distributed mixed effects characterized by the Dirichlet process (NB-DP) to model crash data. This objective was accomplished using two datasets. The new model was compared to the NB model and the recently introduced model based on the mixture of the NB and Lindley (NB-L) distributions. Overall, the research shows that the NB-DP model offers better performance than the NB model when data are over-dispersed and have a heavy tail. The NB-DP performed better than the NB-L when the dataset has a heavy tail but a smaller percentage of zeros; however, both models performed similarly when the dataset contained a large proportion of zeros. In addition to greater flexibility, the NB-DP provides a clustering by-product that allows the safety analyst to better understand the characteristics of the data, such as the identification of outliers and sources of dispersion.
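A brief Python simulation sketch of the mechanism behind the NB-DP idea: site-specific mean rates are drawn from a (truncated) Dirichlet-process stick-breaking mixture, so the resulting counts are over-dispersed, heavy-tailed and naturally clustered. The truncation level, concentration parameter and base measure are hypothetical; the paper's estimation procedure is not reproduced.

# Hypothetical stick-breaking (truncated DP) mixture of NB means (illustration only).
import numpy as np

rng = np.random.default_rng(7)
K, concentration = 50, 2.0
v = rng.beta(1.0, concentration, K)
w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))   # stick-breaking weights
atoms = rng.gamma(2.0, 2.0, K)                               # cluster-specific mean rates

n_sites = 1000
z = rng.choice(K, size=n_sites, p=w / w.sum())               # cluster assignments
mu = atoms[z]
r = 1.5                                                      # NB dispersion
counts = rng.negative_binomial(r, r / (r + mu))

print("mean:", counts.mean(), " variance:", counts.var(),
      " share of zeros:", (counts == 0).mean())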
11.
A chemical kinetics model to explain the abrasive size effect on chemical mechanical polishing (Cited by: 1; self-citations: 0; other citations: 1)
A chemical kinetics model was proposed to describe the abrasive size effect on chemical mechanical polishing (CMP). The model is based on treating the pad as a sort of catalyst and on the re-adhesion of abrasives due to their large size. A general equation was therefore deduced according to chemical kinetics methodology to capture the meaning of the size effect. Finally, based on a set of data related to the abrasive size effect on CMP, a possible form is PR = α·C_C·C_T·X_A·C_W·A^n / [β + γ·X_A·C_W·A^n], where α, β, γ and n are parameters of the CMP system.
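As a quick numeric check of the saturating form of this expression, the Python snippet below evaluates the reconstructed rate equation for a few abrasive sizes. All parameter and variable values are hypothetical, and the symbol grouping follows the reconstruction above rather than values reported in the paper.

# Hypothetical evaluation of PR = a*C_C*C_T*X_A*C_W*A**n / (b + g*X_A*C_W*A**n).
def polish_rate(A, C_C=1.0, C_T=1.0, X_A=0.1, C_W=0.05, a=2.0, b=1.0, g=0.5, n=1.5):
    size_term = X_A * C_W * A**n
    return a * C_C * C_T * size_term / (b + g * size_term)

for A in (10, 50, 100, 200):     # abrasive size, arbitrary units
    print(A, round(polish_rate(A), 4))   # rate saturates toward a/g as A grows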
12.
A stochastic catastrophe model using two-fluid model parameters to investigate traffic safety on urban arterials (Cited by: 1; self-citations: 0; other citations: 1)
During the last few decades, the two-fluid model and its two parameters have been widely used in transportation engineering to represent the quality of operational traffic service on urban arterials. Catastrophe models have also often been used to describe traffic flow on freeway sections. This paper demonstrates the possibility of developing a pro-active network screening tool that estimates the crash rate using a stochastic cusp catastrophe model with the two-fluid model's parameters as inputs. The paper investigates the analogy in logic behind the two-fluid model and the catastrophe model using straightforward graphical illustrations, and then demonstrates the application of the two-fluid parameters to a stochastic catastrophe model designed to estimate the level of safety on urban arterials. Current road safety management, including network safety screening, is reactive rather than pro-active, in the sense that an existing hotspot must be identified before a safety improvement program can be implemented. This paper suggests that a stochastic catastrophe model can make screening more pro-active by identifying urban arterials that currently show an acceptable level of safety but are vulnerable to turning into crash hotspots, so that remedial actions can be implemented before hotspots develop.
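A short Python sketch of the deterministic cusp geometry that underlies a stochastic cusp catastrophe model: equilibria are the real roots of x^3 + alpha*x + beta = 0, and for sufficiently negative alpha two stable regimes coexist. The mapping from the two-fluid parameters to the control variables (alpha, beta) used in the paper is not reproduced; the values below are hypothetical.

# Hypothetical cusp-catastrophe equilibrium illustration (not the paper's fitted model).
import numpy as np

def cusp_equilibria(alpha, beta):
    roots = np.roots([1.0, 0.0, alpha, beta])          # x^3 + alpha*x + beta = 0
    return np.sort(roots[np.isclose(roots.imag, 0.0)].real)

print(cusp_equilibria(alpha=1.0, beta=0.2))     # one equilibrium: smooth regime
print(cusp_equilibria(alpha=-2.0, beta=0.2))    # three equilibria: bistable regime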
13.
A model of traffic crashes in New Zealand (Cited by: 3; self-citations: 0; other citations: 3)
The aim of this study was to examine changes in the trend and seasonal patterns of fatal crashes in New Zealand in relation to changes in economic conditions between 1970 and 1994. The Harvey and Durbin (Journal of the Royal Statistical Society 149 (3) (1986) 187-227) structural time series model (STSM), an 'unobserved components' class of model, was used to estimate models for quarterly fatal traffic crashes. The dependent variable was modelled as the number of crashes and as three variants of the crash rate (crashes per 10,000 km travelled, crashes per 1,000 vehicles, and crashes per 1,000 population). Independent variables included in the models were the unemployment rate (UER), real gross domestic product (GDP) per capita, the proportion of motorcycles, the proportion of young males in the population, alcohol consumption per capita, the open road speed limit, and dummy variables for the 1973 and 1979 oil crises and seat belt wearing laws. UER, real GDP per capita, and alcohol consumption were all significant and important factors in explaining the short-run dynamics of the models. In the long run, real GDP per capita was directly related to the number of crashes but, after controlling for distance travelled, was not significant. This suggests that increases in income are associated with a short-run reduction in risk but with increased exposure to crashes (i.e. distance travelled) in the long run. A 1% increase in the open road speed limit was associated with a long-run 0.5% increase in fatal crashes. Substantial reductions in fatal crashes were associated with the 1979 oil crisis and seat belt wearing laws; the 1984 universal seat belt wearing law was associated with a sustained 15.6% reduction in fatal crashes. These road policy factors appeared to have a greater influence on crashes than demographic and economic factors.
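A minimal Python sketch of a Harvey-type structural time series model for quarterly data, with a local linear trend, a quarterly seasonal and one explanatory variable, using statsmodels' UnobservedComponents. The series and the unemployment regressor are simulated; this is not the New Zealand data set or the full set of covariates used in the study.

# Hypothetical quarterly STSM: local linear trend + seasonal + one regressor.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)
periods = 100                                             # 25 years of quarters
idx = pd.period_range("1970Q1", periods=periods, freq="Q")
unemployment = 4.0 + np.cumsum(rng.normal(0.0, 0.2, periods))
seasonal = np.tile([5.0, -2.0, -1.0, -2.0], periods // 4)
crashes = 120.0 - 3.0 * unemployment + seasonal + rng.normal(0.0, 4.0, periods)
y = pd.Series(crashes, index=idx)

model = sm.tsa.UnobservedComponents(y, level="local linear trend",
                                    seasonal=4, exog=unemployment)
res = model.fit(disp=False)
print(res.params)    # includes the coefficient on the unemployment regressor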
14.
MacNab YC, Accident Analysis & Prevention, 2003, 35(1): 91-102
This article presents a recent study which applies Bayesian hierarchical methodology to model and analyse accident and injury surveillance data. A hierarchical Poisson random effects spatio-temporal model is introduced, and an analysis of inter-regional variations and regional trends in hospitalisations due to motor vehicle accident injuries to boys aged 0-24 in the province of British Columbia, Canada, is presented. The objective of this article is to illustrate how the modelling technique can be implemented as part of an accident and injury surveillance and prevention system in which transportation and/or health authorities may routinely examine accidents, injuries, and hospitalisations to target high-risk regions for prevention programs, to evaluate prevention strategies, and to assist in health planning and resource allocation. The innovation of the methodology is its ability to uncover and highlight important underlying structure in the data. Between 1987 and 1996, the British Columbia hospital separation registry recorded 10,599 motor vehicle traffic injury related hospitalisations among boys aged 0-24 who resided in British Columbia, of which the majority (89%) occurred to boys aged 15-24. The injuries were aggregated by three age groups (0-4, 5-14, and 15-24), 20 health regions (based on place of residence), and 10 calendar years (1987 to 1996), and the corresponding mid-year population estimates were used as the 'at risk' population. An empirical Bayes inference technique using penalised quasi-likelihood estimation was implemented to model both rates and counts, with spline smoothing accommodating non-linear temporal effects. The results show that (a) crude rates and ratios at the health region level are unstable, (b) the models with spline smoothing enable exploration of possible shapes of injury trends at both the provincial and the regional level, and (c) the fitted models provide a wealth of information about the patterns (both over space and time) of the injury counts, rates and ratios. During the 10-year period, high injury risk ratios shifted from the northwest to the central-interior and the southeast of the province.
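As a much simpler stand-in for the kind of rate model involved, the Python sketch below fits a Poisson GLM to simulated regional injury counts with log(population) as an offset and region and year terms. It is a fixed-effects illustration only, not the paper's empirical Bayes spatio-temporal random-effects model with spline smoothing; all data are simulated.

# Hypothetical Poisson rate regression with a population offset (illustration only).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
regions, years = 20, 10
df = pd.DataFrame({
    "region": np.repeat(np.arange(regions), years),
    "year": np.tile(np.arange(years), regions),
})
df["population"] = rng.integers(5_000, 80_000, len(df))
region_effect = np.exp(-6.0 + 0.2 * rng.normal(size=regions))[df["region"]]
rate = region_effect * np.exp(-0.03 * df["year"])          # mild downward time trend
df["injuries"] = rng.poisson(rate * df["population"])

fit = smf.glm("injuries ~ C(region) + year", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["population"])).fit()
print(fit.params["year"])    # estimated log-linear trend in the hospitalisation rate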
15.
A utility maximization model of driver traffic safety behavior (Cited by: 1; self-citations: 1; other citations: 0)
G. Blomquist, Accident Analysis & Prevention, 1986, 18(5): 371-375
A simple utility maximization model is presented to illustrate that risk compensation is a natural part of human behavior when individuals pursue multiple goals with limited resources. In this positive economic model, driver safety effort is determined by a balance between reduced risk and increased disutility cost. Changes which affect that balance induce drivers to change their own safety efforts. Under plausible conditions, a change in exogenous safety, which is beyond driver control, causes a compensatory change in driver effort in the opposite direction. A sample of special seat belt use studies illustrates the usefulness of the model.
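A tiny Python sketch of the risk-compensation logic: a driver chooses safety effort to minimise expected accident loss plus effort disutility, and raising exogenous safety lowers the chosen effort. The functional forms and parameter values are hypothetical, not Blomquist's specification.

# Hypothetical risk-compensation illustration (not the paper's model).
from scipy.optimize import minimize_scalar

def optimal_effort(s, p0=0.05, loss=1000.0, c=50.0):
    # expected cost = accident probability p0*(1 - s)*(1 - e) times loss, plus effort cost c*e^2
    cost = lambda e: p0 * (1.0 - s) * (1.0 - e) * loss + c * e**2
    return minimize_scalar(cost, bounds=(0.0, 1.0), method="bounded").x

for s in (0.0, 0.2, 0.4):        # increasing exogenous (engineered) safety
    print(f"exogenous safety {s:.1f} -> chosen driver effort {optimal_effort(s):.2f}")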
16.
The data analysed consist of the joint distribution of severities of injury to vehicle drivers in head-on crashes, stratified according to the relative masses of the vehicles. On the basis of some fairly strong assumptions, a model is developed which results in the joint distribution being bivariate normal. The parameters are interpretable in terms of the effect of velocity change on injury severity, and the relative variability of velocity change and of injury severity at a particular velocity change. The predictions made by the model enjoy a considerable degree of success.
17.
18.
In 2006, we carried out a cross-sectional study in the urban area of Pelotas, Southern Brazil, with the aim of outlining the profile of bicycle commuters, analyzing their use of safety equipment and risk behaviors, and examining the association between these variables and involvement in traffic accidents in the previous 12 months. The study was based on the baseline survey carried out prior to an educational intervention aimed at reducing accidents among cyclists. The sample included 1133 male subjects aged 20 years or more who used a bicycle for commuting. Crude and adjusted analyses were carried out using Poisson regression. We recorded a total of 152 reported traffic accidents in the 12 months preceding the interview, involving 10.8% of subjects. Most of the risk behaviors studied and the use of safety equipment showed no significant association with accidents. Only commuting by bicycle seven days per week, as opposed to five or six, and a combination of extremely imprudent behaviors, such as zigzagging through traffic, riding after ingesting alcohol, and high-speed riding, were found to be risk factors for accidents. Our findings suggest that in the context where the study was conducted (poor road signaling, limited policing, aggressive driving), changing cyclist behavior may not have a substantial impact on accident reduction before other road traffic interventions are implemented.
19.
In this paper, a bivariate replacement policy (n, T) for a cumulative shock damage process is presented that includes the concept of a cumulative repair cost limit. The arriving shocks can be divided into two kinds. Each type-I shock causes a random amount of damage, and these damages are additive; when the total damage exceeds a failure level, the system goes into serious failure. A type-II shock puts the system into minor failure, and such a failure can be corrected by minimal repair. When a minor failure occurs, the repair cost is evaluated, and minimal repair is executed if the accumulated repair cost is less than a predetermined limit L. The system is replaced at scheduled time T, at the n-th minor failure, or at serious failure. The long-term expected cost per unit time is derived using the expected costs as the optimality criterion. The minimum-cost policy is derived, and existence and uniqueness of the optimal n* and T* are proved. This bivariate optimal replacement policy (n, T) is shown to be better than the optimal T* policy and the optimal n* policy alone.
20.
I. K. Ravichandra Rao, Scientometrics, 1998, 41(1-2): 93-100
In his book on “Documentation”, Bradford derived the law of scattering based on an algebraic argument with the supposition that n_1 = n_2 = n, where n_1 and n_2 are computed from the average number of articles per journal in the first three zones. An analysis of a small sample of 12 data sets using a t-test suggests that it is unlikely that n_1 = n_2. Further, an attempt has been made to identify a suitable model to explain the law of scattering; among the various models tried, the log-normal model fits much better than many others, including the log-linear model.
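A small Python sketch of the kind of comparison described: a log-linear (Bradford-type) fit of the cumulative articles-versus-log-rank curve alongside a log-normal fit of the articles-per-journal distribution. The journal productivity data are simulated, not Rao's 12 data sets, and the goodness-of-fit summaries are illustrative only.

# Hypothetical scattering data: log-linear (Bradford bibliograph) vs. log-normal fit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
articles = np.sort(rng.lognormal(mean=1.0, sigma=1.2, size=300))[::-1]  # per journal
rank = np.arange(1, len(articles) + 1)
R = np.cumsum(articles)                      # cumulative articles over ranked journals

# Log-linear fit of the cumulative curve: R(r) = a + b*log(r)
slope, intercept, r_value, _, _ = stats.linregress(np.log(rank), R)
print("log-linear fit R^2:", r_value**2)

# Log-normal fit of the productivity distribution itself
shape, loc, scale = stats.lognorm.fit(articles, floc=0)
ks = stats.kstest(articles, "lognorm", args=(shape, loc, scale))
print("log-normal KS statistic:", ks.statistic)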