Similar Documents (20 results found)
1.
The Lindley distribution applied to competing risks lifetime data (cited by 1)
Competing risks data usually arise in studies in which the death or failure of an individual or an item may be classified into one of k ≥ 2 mutually exclusive causes. In this paper a simple competing risks distribution is proposed as a possible alternative to the exponential or Weibull distributions usually considered in lifetime data analysis. We consider the case when the competing risks have a Lindley distribution. We also assume that the competing events are uncorrelated and that each subject can experience only one type of event at any particular time.
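As a sketch of the setup described above (not the paper's own code), the following simulates competing risks with Lindley lifetimes, using the standard mixture representation of the Lindley distribution: Exp(θ) with weight θ/(1+θ) and Gamma(2, θ) with weight 1/(1+θ). The parameter values are illustrative.

```python
import random

def rlindley(theta, rng):
    """Draw one Lindley(theta) variate via its mixture representation:
    Exp(theta) with probability theta/(1+theta), else Gamma(2, theta)."""
    if rng.random() < theta / (1.0 + theta):
        return rng.expovariate(theta)
    # Gamma(2, theta) is the sum of two independent Exp(theta) draws
    return rng.expovariate(theta) + rng.expovariate(theta)

def mean_lindley(theta):
    """E[X] for Lindley(theta): (theta + 2) / (theta * (theta + 1))."""
    return (theta + 2.0) / (theta * (theta + 1.0))

def simulate_competing(theta1, theta2, n, seed=42):
    """Latent-failure-times view: each subject fails from whichever of the
    two independent Lindley lifetimes is smallest (one event per subject)."""
    rng = random.Random(seed)
    times, causes = [], []
    for _ in range(n):
        t1, t2 = rlindley(theta1, rng), rlindley(theta2, rng)
        times.append(min(t1, t2))
        causes.append(1 if t1 < t2 else 2)
    return times, causes

times, causes = simulate_competing(1.5, 0.5, 20000)
```

Because cause 1 has the larger rate parameter (shorter lifetimes), it accounts for the majority of observed failures, and the mean observed time lies below either marginal Lindley mean.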

2.
In reliability analysis, accelerated life-testing allows for the gradual increase of stress levels on test units during an experiment. In a special class of accelerated life tests known as step-stress tests, the stress levels increase discretely at pre-fixed time points, and this allows the experimenter to obtain information on the parameters of the lifetime distributions more quickly than under normal operating conditions. Moreover, when a test unit fails, there is often more than one fatal cause of the failure, such as mechanical or electrical. In this article, we consider the simple step-stress model under time constraint when the lifetime distributions of the different risk factors are independently exponentially distributed. Under this setup, we derive the maximum likelihood estimators (MLEs) of the unknown mean parameters of the different causes under the assumption of a cumulative exposure model. Since it is found that the MLEs do not exist when there is no failure by any particular risk factor within the specified time frame, the exact sampling distributions of the MLEs are derived through the use of conditional moment generating functions. Using these exact distributions as well as the asymptotic distributions, the parametric bootstrap method, and the Bayesian posterior distribution, we discuss the construction of confidence intervals and credible intervals for the parameters. Their performance is assessed through Monte Carlo simulations and finally, we illustrate the methods of inference discussed here with an example.
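A minimal sketch of the MLE for independent exponential competing risks (omitting the step-stress/cumulative-exposure machinery of the article): with complete data, the MLE of each mean is the total time on test divided by the number of failures attributed to that cause, and it is undefined when a cause produced no failures, which mirrors the non-existence issue the abstract addresses.

```python
import random

def mle_exponential_competing(times, causes, cause_labels=(1, 2)):
    """MLE of the mean lifetime of each exponential risk from complete
    competing-risks data: theta_j = (total time on test) / n_j.
    Returns None for a cause with no observed failures (MLE does not exist)."""
    ttt = sum(times)
    est = {}
    for j in cause_labels:
        n_j = causes.count(j)
        est[j] = ttt / n_j if n_j > 0 else None
    return est

# simulate two exponential risks with (hypothetical) means 5 and 10
rng = random.Random(7)
theta = {1: 5.0, 2: 10.0}
times, causes = [], []
for _ in range(50000):
    t = {j: rng.expovariate(1.0 / th) for j, th in theta.items()}
    j_star = min(t, key=t.get)  # failure comes from the fastest cause
    times.append(t[j_star])
    causes.append(j_star)

est = mle_exponential_competing(times, causes)
```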

3.
The duration of a software project is a very important feature, closely related to its cost. Various methods and models have been proposed in order to predict not only the cost of a software project but also its duration. Since duration is essentially the random length of a time interval from a starting to a terminating event, in this paper we present a framework of statistical tools, appropriate for studying and modeling the distribution of the duration. The idea for our approach comes from the parallelism of duration to the life of an entity which is frequently studied in biostatistics by a certain statistical methodology known as survival analysis. This type of analysis offers great flexibility in modeling the duration and in computing various statistics useful for inference and estimation. As in any other statistical methodology, the approach is based on datasets of measurements on projects. However, one of the most important advantages is that we can use in our data information not only from completed projects, but also from ongoing projects. In this paper we present the general principles of the methodology for a comprehensive duration analysis and we also illustrate it with applications to known data sets. The analysis showed that duration is affected by various factors such as customer participation, use of tools, software logical complexity, user requirements volatility and staff tool skills.
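The key advantage noted above, that ongoing projects still contribute information, is exactly what right-censoring handles in survival analysis. A self-contained Kaplan-Meier sketch (durations and statuses below are invented for illustration):

```python
def kaplan_meier(durations, completed):
    """Kaplan-Meier estimate of the survival function S(t) of project
    duration. completed[i] is False for ongoing (right-censored) projects,
    which still count in the risk set at every time before their censoring."""
    event_times = sorted(set(d for d, c in zip(durations, completed) if c))
    surv, s = [], 1.0
    for t in event_times:
        at_risk = sum(1 for d in durations if d >= t)
        finished = sum(1 for d, c in zip(durations, completed) if c and d == t)
        s *= 1.0 - finished / at_risk
        surv.append((t, s))
    return surv

# durations in months; False marks a project still running at analysis time
durations = [3, 5, 5, 8, 12, 12, 15, 20]
completed = [True, True, False, True, True, False, True, False]
curve = kaplan_meier(durations, completed)
```

Each point of `curve` is (t, estimated probability a project lasts beyond t); the censored projects shrink the drop at each completion time rather than being discarded.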
Ioannis Stamelos

4.
Past studies in the IT outsourcing area have examined the management of IT outsourcing relationships from a variety of perspectives. The present paper extends this line of research. In this study, we take a multi-theoretic perspective to explore factors that determine the duration of continuing IT outsourcing relationships between vendor and client firms. Five ex-ante and two ex-post factors that may influence relationship duration were examined in this study. Data for this study were collected using a nationwide survey. To investigate the dynamics of continuing outsourcing relationships through repetitive contracts, we performed survival analysis using an accelerated failure-time (AFT) model. Four factors are found to have a significant relationship with relationship duration as hypothesized. However, three factors, of which two are ex-post factors, are found not to have a significant impact on outsourcing relationship duration. Implications and contributions of the study are discussed.
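The mechanics of an AFT model can be sketched as follows: a covariate rescales time multiplicatively, log(T) = b0 + b1*x + sigma*eps, so exp(b1) is the factor by which the typical duration changes per unit of x. All parameter values and the covariate interpretation below are hypothetical, not taken from the study.

```python
import math
import random

# Hypothetical AFT specification: baseline median 24 months, and a binary
# covariate x (e.g. presence of some ex-ante factor) multiplying duration
# by exp(b1) ~ 1.49.
rng = random.Random(1)
b0, b1, sigma = math.log(24.0), 0.4, 0.3

def draw_duration(x):
    """One simulated relationship duration under the log-normal AFT model."""
    return math.exp(b0 + b1 * x + sigma * rng.gauss(0.0, 1.0))

group0 = [draw_duration(0) for _ in range(4000)]  # covariate absent
group1 = [draw_duration(1) for _ in range(4000)]  # covariate present

def median(v):
    s = sorted(v)
    return 0.5 * (s[len(s) // 2 - 1] + s[len(s) // 2])

ratio = median(group1) / median(group0)  # estimates the acceleration factor exp(b1)
```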

5.
Load restraint systems in automobile transport utilise tie-down lashings placed over the car's tyres, which are tensioned manually by the operator using a ratchet assembly. This process has been identified as a significant manual handling injury risk. The aim of this study was to gain insight into the current practices associated with tie-down lashing operation and to identify the gaps between current and optimal practice. We approached this with qualitative and quantitative assessments and one numerical simulation to establish: (i) insight into the factors involved in ratcheting; (ii) the required tension to hold the car on the trailer; and (iii) the tension achieved by drivers in practice and the associated joint loads. We identified that the method recommended to the drivers was not used in practice. Drivers instead tensioned the straps to the maximum of their capability, leading to over-tensioning and mechanical overload at the shoulder and elbow. We identified the postures and strategies that resulted in the lowest loads on the upper body during ratcheting (using both hands and performing the task with the full body). This research marks the first step towards the development of a training programme aimed at changing practice to reduce injury risks associated with the operation of tie-down lashings in the automobile transport industry.

Practitioner Summary: The study investigated current practice associated with the operation of tie-down lashings through qualitative (interviews) and quantitative (biomechanical analysis) methods. Operators tended to systematically over-tension the lashings and consequently overexert, increasing injury risks.


6.
A BASIC program is presented for the calculation of the complete temperature variation of mineral-growth rates based on partial data. The algorithm is derived from a corresponding states equation for crystal growth, together with a compensation relationship in the standard Arrhenius equation of growth rate versus temperature.
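The Arrhenius step can be illustrated independently of the paper's BASIC program: ln(k) = ln(A) - Ea/(R*T) is linear in 1/T, so a least-squares fit to partial rate data recovers A and Ea and lets one evaluate the growth rate at any temperature. The data below are synthetic, generated from assumed values of A and Ea.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def fit_arrhenius(temps_K, rates):
    """Least-squares fit of ln(k) = ln(A) - Ea/(R*T); returns (A, Ea)."""
    xs = [1.0 / T for T in temps_K]
    ys = [math.log(k) for k in rates]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    lnA = ybar - slope * xbar
    return math.exp(lnA), -slope * R

# synthetic partial data from assumed A = 1e6 and Ea = 50 kJ/mol
A_true, Ea_true = 1.0e6, 5.0e4
temps = [550.0, 600.0, 650.0]
rates = [A_true * math.exp(-Ea_true / (R * T)) for T in temps]
A_hat, Ea_hat = fit_arrhenius(temps, rates)
```

With the fitted pair (A_hat, Ea_hat), the full temperature variation follows as k(T) = A_hat * exp(-Ea_hat / (R*T)).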

7.
Ricardo C.L.F., Pedro L.D. Automatica, 2009, 45(11): 2620-2626
This paper investigates the problems of robust stability analysis and state feedback control design for discrete-time linear systems with time-varying parameters. It is assumed that the time-varying parameters lie inside a polytopic domain and have known bounds on their rate of variation. A convex model is proposed to represent the parameters and their variations as a polytope and linear matrix inequality relaxations that take into account the bounds on the rates of parameter variations are proposed. A feasible solution provides a parameter-dependent Lyapunov function with polynomial dependence on the parameters assuring the robust stability of this class of systems. Extensions to deal with robust control design as well as gain-scheduling by state feedback are also provided in terms of linear matrix inequalities. Numerical examples illustrate the results.
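A much weaker but easily coded first check on such a polytopic system is Schur stability (spectral radius below one) at each vertex; this is necessary but not sufficient under time-varying parameters, which is precisely why the parameter-dependent Lyapunov/LMI machinery above is needed. The two vertices below are hypothetical 2x2 examples, so the eigenvalues can be computed by the quadratic formula without a linear-algebra library.

```python
import math

def spectral_radius_2x2(M):
    """Spectral radius of a 2x2 real matrix via the quadratic formula."""
    (a, b), (c, d) = M
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4.0 * det
    if disc >= 0.0:
        r = math.sqrt(disc)
        return max(abs((tr + r) / 2.0), abs((tr - r) / 2.0))
    # complex-conjugate pair: |lambda| = sqrt(det) (det > 0 here)
    return math.sqrt(det)

# hypothetical polytope vertices for A(t) = a1(t)*A1 + a2(t)*A2
A1 = [[0.5, 0.2], [0.0, 0.4]]
A2 = [[0.3, -0.1], [0.2, 0.6]]

# necessary condition only: vertex stability does not by itself certify
# robust stability for arbitrary rates of parameter variation
vertex_stable = all(spectral_radius_2x2(A) < 1.0 for A in (A1, A2))
```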

8.
This paper considers the estimation of the volatility of the instantaneous short interest rate from a new perspective. Rather than using discretely compounded market rates as a proxy for the instantaneous short rate of interest, we derive a relationship between observed LIBOR rates and certain unobserved instantaneous forward rates. We determine the stochastic dynamics for these rates under the risk-neutral measure and propose a filtering estimation algorithm for a time-discretised version of the resulting interest rate dynamics based on dynamic Bayesian updating in order to estimate the volatility function. Our time discretisation can be justified by the fact that data are observed discretely in time. The method is applied to US Treasury rates of various maturities to compute a (posterior) distribution for the parameters of the volatility specification.

9.
The paper presents nonlinear averaging theorems for two-time scale systems, where the dynamics of the fast system are allowed to vary with the slow system. The results are applied to the Narendra-Valavani adaptive control algorithm, and estimates of the parameter convergence rates are obtained which do not rely on a linearization of the system around the equilibrium, and therefore are valid in a larger region in the parameter space.

10.
In this note, the problems of stability analysis and controller synthesis of Markovian jump systems with time-varying delay and partially known transition rates are investigated via an input-output approach. First, the system under consideration is transformed into an interconnected system, and new results on the stochastic scaled small-gain condition for stochastic interconnected systems are established, which are crucial for the problems considered in this paper. Based on the system transformation and the stochastic scaled small-gain theorem, stochastic stability of the original system is examined via the stochastic version of the bounded realness of the transformed forward system. The merit of the proposed approach lies in its reduced conservatism, which is made possible by a precise approximation of the time-varying delay and the new result on the stochastic scaled small-gain theorem. The proposed stability condition is demonstrated to be much less conservative than most existing results. Moreover, the problem of stabilization is further solved with an admissible controller designed via convex optimizations, whose effectiveness is also illustrated via numerical examples. Copyright © 2015 John Wiley & Sons, Ltd.

11.
This paper focuses on university-level education offered by methods of distance learning in the field of computers and aims at the investigation of the main causes of student dropouts. The presented study is based on the students of the Course of "Informatics", Faculty of Science and Technology of the Hellenic Open University, and investigates the particularities of education provided through the use of computers and technology in general. This paper presents information about the students' profile, the use of computer technology, the percentage of dropouts, as well as a classification of the reasons for dropouts based on interviews with the students. The study shows that dropouts are correlated with the use of technological means and, based on this fact, the Hellenic Open University implemented interventions in the use of such means. It also shows that a correlation exists between dropouts and students' age, but not gender, although female students are more reluctant to start following a course. However, it is also shown that female students' commitment to a course is stronger and thus, they do not drop out as easily as male students do. Furthermore, the results of this study strongly correlate dropouts to the existence of previous education in the field of Informatics or to working with computers, but not to the degree of specialisation in computers. Finally, the paper presents the reasons provided by the students for dropping out, with the main reasons being the inability to estimate the time required for university-level studies and the perceived difficulty of the computers course.

12.
This paper presents a network model of financial flow of funds accounting which can be used to calculate reconciled values of all outstanding financial instruments, tangible assets, and net worth. The model captures the accounting identities which must hold and permits the estimation of sector holdings of both assets and liabilities, as well as total outstanding instrument volumes. A decomposition algorithm is then developed which resolves the network problem into simpler subproblems, each of which can then be solved exactly and in closed form. A theoretical analysis of the algorithm is also given. Finally, the algorithm is applied to a dataset derived from the Federal Reserve Board data for 1989. The balanced dataset can then be used as a baseline for an empirical general equilibrium model. This research was supported by cooperative agreement No. 58-3AEN-0-80066 from the USDA. This support is gratefully acknowledged.

13.
An efficient estimate for the change point in the hazard function is obtained. This is based on a Bayesian estimator which uses equations concerning the parameters of a recently proposed hazard function. It is found through a simulation study that the proposed estimator is more efficient than the traditional estimators in many cases. Furthermore, experimental results that use data of breast cancer patients and some lymphoma data show that the proposed estimator is also practical in applications.
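To make the change-point idea concrete, here is a frequentist profile-likelihood sketch (not the paper's Bayesian estimator): assume a piecewise-constant hazard that jumps from lambda1 to lambda2 at an unknown time tau, profile out the two rates, and pick the tau on a grid that maximises the profiled log-likelihood. All rates and sample sizes below are invented.

```python
import math
import random

def profile_loglik(times, tau):
    """Profiled log-likelihood of a piecewise-constant hazard (rate lambda1
    before tau, lambda2 after) for complete data: for each piece the MLE of
    the rate is deaths/exposure, giving d*log(d/e) - d."""
    e1 = sum(min(t, tau) for t in times)        # exposure before tau
    e2 = sum(max(t - tau, 0.0) for t in times)  # exposure after tau
    d1 = sum(1 for t in times if t <= tau)      # failures before tau
    d2 = len(times) - d1
    ll = 0.0
    for d, e in ((d1, e1), (d2, e2)):
        if d > 0 and e > 0:
            ll += d * math.log(d / e) - d
    return ll

def change_point(times, grid):
    return max(grid, key=lambda tau: profile_loglik(times, tau))

# simulate: hazard 0.2 before tau = 2.0, then 1.0 (memoryless restart)
rng = random.Random(3)
tau_true, lam1, lam2 = 2.0, 0.2, 1.0
times = []
for _ in range(3000):
    t = rng.expovariate(lam1)
    if t > tau_true:
        t = tau_true + rng.expovariate(lam2)
    times.append(t)

grid = [0.5 + 0.1 * i for i in range(40)]  # candidate change points
tau_hat = change_point(times, grid)
```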

14.
The area under the ROC curve, or AUC, has been widely used to assess the ranking performance of binary scoring classifiers. Given a sample, the metric considers the ordering of positive and negative instances, i.e., the sign of the corresponding score differences. From a model evaluation and selection point of view, it may appear unreasonable to ignore the absolute value of these differences. For this reason, several variants of the AUC metric that take score differences into account have recently been proposed. In this paper, we present a unified framework for these metrics and provide a formal analysis. We conjecture that, despite their intuitive appeal, none of the variants is actually effective, at least with regard to model evaluation and selection. An extensive empirical analysis corroborates this conjecture. Our findings also shed light on recent research dealing with the construction of AUC-optimizing classifiers.
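The distinction the abstract draws can be shown in a few lines: plain AUC counts only the sign of each positive-negative score difference, while a score-sensitive variant (here a generic sigmoid-smoothed version, one member of the family of variants rather than any specific proposal from the paper) also responds to the magnitude of the difference.

```python
import math

def auc(pos_scores, neg_scores):
    """AUC as the fraction of positive-negative pairs ranked correctly;
    ties count one half. Only the sign of the score difference matters."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

def soft_auc(pos_scores, neg_scores, scale=1.0):
    """A score-sensitive variant (illustrative): replace the 0/1 step on the
    difference p - n with a sigmoid, so the magnitude of p - n matters."""
    sig = lambda z: 1.0 / (1.0 + math.exp(-scale * z))
    total = sum(sig(p - n) for p in pos_scores for n in neg_scores)
    return total / (len(pos_scores) * len(neg_scores))

pos = [0.9, 0.8, 0.4]
neg = [0.7, 0.3, 0.2]
```

For these scores, `auc(pos, neg)` is 8/9 (one misranked pair), while `soft_auc` returns a different value because it is pulled toward 0.5 by small score gaps.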

15.
A method is proposed for obtaining combinations of factors derived from a factor analysis characterized by a large number of near-zero loadings relative to the original variables. It differs from rotation of factors, which replaces r factors by an equal number, r, of differently oriented factors, in that each solution consists of a single direction in F variable space.

16.
Context: In software project management, the distribution of resources to various project activities is one of the most challenging problems, since it affects team productivity, product quality and project constraints related to budget and scheduling.
Objective: The study aims to (a) reveal the high complexity of modelling the effort usage proportion in different phases, as well as the divergence from various rules-of-thumb in the related literature, and (b) present a systematic data analysis framework able to offer better interpretations and visualisation of the effort distributed in specific phases.
Method: The basis for the proposed multivariate statistical framework is Compositional Data Analysis, a methodology appropriate for proportions, along with other methods like the deviation from rules-of-thumb, cluster analysis and analysis of variance. The effort allocations to phases, as reported in around 1500 software projects of the ISBSG R11 repository, were transformed to vectors of proportions of the total effort and were analysed with respect to prime project attributes.
Results: The proposed statistical framework was able to detect high dispersion among data, distribution inequality and various interesting correlations and trends, groupings and outliers, especially with respect to other categorical and continuous project attributes. Only a very small number of projects were found close to the rules-of-thumb from the related literature. Significant differences in the proportion of effort spent in different phases for different types of projects were found.
Conclusion: There is no simple model for the effort allocated to phases of software projects. The data from previous projects can provide valuable information regarding the distribution of the effort for various types of projects, through analysis with multivariate statistical methodologies. The proposed statistical framework is generic and can be easily applied in a similar sense to any dataset containing effort allocation to phases.
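The core Compositional Data Analysis step can be sketched briefly: effort proportions sum to one, so ordinary multivariate statistics do not apply directly; the centred log-ratio (clr) transform maps a composition into unconstrained real space where they do. The phase names and shares below are hypothetical, not from the ISBSG data.

```python
import math

def clr(proportions, eps=1e-6):
    """Centred log-ratio transform from Compositional Data Analysis:
    divide each component by the geometric mean and take logs. Zero
    proportions are nudged up by eps before transforming."""
    x = [max(p, eps) for p in proportions]
    g = math.exp(sum(math.log(v) for v in x) / len(x))  # geometric mean
    return [math.log(v / g) for v in x]

# hypothetical effort shares: plan / specify / build / test / implement
shares = [0.10, 0.15, 0.45, 0.20, 0.10]
z = clr(shares)
```

The clr coordinates always sum to zero (they live in a hyperplane), and the largest coordinate identifies the phase with the largest effort share; cluster analysis and ANOVA can then be run on `z` instead of the raw proportions.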

17.
Multivariate failure time data are commonly encountered in biomedicine, because each study subject may experience multiple events or because there exists clustering of subjects such that failure times within the same cluster are correlated. MULCOX2 implements a general statistical methodology for analyzing such data. This approach formulates the marginal distributions of multivariate failure times by Cox proportional hazards models without specifying the nature of the dependence among related failure times. The baseline hazard functions for the marginal models may be identical or different. A variety of statistical inferences can be made regarding the effects of (possibly time-dependent) covariates on the failure rates. Although designed primarily for the marginal approach, MULCOX2 is general enough to implement several alternative methods. The program runs on any computer with a FORTRAN compiler; the running time is minimal. Two illustrative examples are provided.

18.
IT managers in global firms often rely on user evaluations to guide their decision-making in adopting, implementing, and monitoring the effectiveness of enterprise systems across national cultures. In these decisions, managers need instruments that provide valid comparisons across cultures. Using samples representing five nations/world regions including the US, Western Europe, Saudi Arabia, India, and Taiwan, we used multi-group invariance analysis to evaluate whether the end-user computing satisfaction (EUCS) instrument (12-item summed scale and five factors) provided equivalent measurement across cultures. The results provided evidence that the EUCS instrument's 12-item scale and the five factors were equivalent across the cultures we examined. The implications of this for the global management of technology are discussed. Knowledge of the equivalence of MIS instruments across national cultures can enhance the MIS cross-cultural research agenda.

19.
A Bayesian analysis of the thermal challenge problem (cited by 1)
A major question for the application of computer models is: does the computer model adequately represent reality? Viewing computer models as a potentially biased representation of reality, Bayarri et al. [M. Bayarri, J. Berger, R. Paulo, J. Sacks, J. Cafeo, J. Cavendish, C. Lin, J. Tu, A framework for validation of computer models, Technometrics 49 (2) (2007) 138-154] develop the simulator assessment and validation engine (SAVE) method as a general framework for answering this question. In this paper, we apply the SAVE method to the challenge problem, which involves a thermal computer model designed for certain devices. We develop a statement of confidence that the devices modeled can be applied in intended situations.

20.
M. Woodman, D. C. Ince. Software, 1985, 15(11): 1057-1072
This paper describes a portable software tool used for the processing and maintenance of data flow diagrams which form the basis of structured analysis techniques. The tool itself is based on the idea that data flow diagrams can be modelled by means of semantic nets and can be manipulated by a semantic net processor. A major feature of the tool is the facilities it provides for the maintenance programmer.
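The semantic-net idea above can be sketched as a set of labelled triples: DFD elements become typed nodes, and data flows become relations that a maintenance programmer can query. All node and relation names below are illustrative, not taken from the paper's tool.

```python
# A data flow diagram held as a semantic net: (source, relation, target)
# triples over typed nodes, in the spirit of the tool described above.

class SemanticNet:
    def __init__(self):
        self.triples = set()

    def add(self, source, relation, target):
        self.triples.add((source, relation, target))

    def targets(self, source, relation):
        """Maintenance-style query: everything `source` reaches via `relation`."""
        return sorted(t for (s, r, t) in self.triples
                      if s == source and r == relation)

net = SemanticNet()
net.add("Customer", "is_a", "external_entity")
net.add("ValidateOrder", "is_a", "process")
net.add("Orders", "is_a", "data_store")
net.add("Customer", "sends", "ValidateOrder")
net.add("ValidateOrder", "writes", "Orders")

# which stores does the ValidateOrder process write to?
written = net.targets("ValidateOrder", "writes")
```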
