Similar Articles (20 results)
1.
The problem of comparing two proportions in a 2 × 2 matched-pairs design with binary responses is considered. We consider one-sided null and alternative hypotheses. The problem has two nuisance parameters. Using the monotonicity of the multinomial distribution, four exact unconditional tests based on p-values are proposed by reducing the dimension of the nuisance parameter space from two to one in computation. The size and power of the four exact tests and two other tests, the exact conditional binomial test and the asymptotic McNemar's test, are considered. It is shown that the tests based on the confidence interval p-value are more powerful than the tests based on the standard p-value. In addition, it is found that the exact conditional binomial test is conservative and not powerful for testing the hypothesis. Moreover, the asymptotic McNemar's test is shown to have incorrect size; that is, its size is larger than the nominal level of the test. Overall, the test based on McNemar's statistic and the confidence interval p-value is found to be the most powerful test with the correct size among the tests in this comparison.
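As a hedged illustration of the two baseline procedures this abstract compares (not the paper's proposed unconditional tests), the sketch below computes the asymptotic McNemar statistic and the exact conditional binomial p-value from the discordant pair counts b and c; the function names and the one-sided alternative are our own choices:

```python
from scipy.stats import binom, chi2

def mcnemar_asymptotic(b, c):
    """Asymptotic McNemar chi-square test on the discordant counts b, c.
    The abstract notes this test can exceed its nominal size."""
    stat = (b - c) ** 2 / (b + c)
    return stat, chi2.sf(stat, df=1)

def exact_conditional_binomial(b, c):
    """Exact conditional binomial test: given n = b + c discordant
    pairs, b ~ Binomial(n, 1/2) under the null of equal marginal
    proportions; one-sided p-value P(B >= b)."""
    return binom.sf(b - 1, b + c, 0.5)
```

For example, with 10 versus 5 discordant pairs the exact conditional one-sided p-value is about 0.15, while the asymptotic two-sided McNemar p-value is about 0.20.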

2.
Testing for significant interaction between two factors depends on the number of replicates in each cell of the two-way table and on the structure of the interaction. If there is interaction between the two factors, the model for the observations includes an interaction term and is called a 'non-additive model', which makes interaction and non-additivity equivalent in meaning. When several observations are taken at each level combination of the two factors, non-additivity can easily be tested by the usual two-way ANOVA method, which, however, cannot be used when there is only one observation per cell. For that case, several methods have been developed, starting with Tukey's one-degree-of-freedom test, in which the interaction is assumed to be the product of the two factors' effects. Other methods address different structures of interaction when there is only one observation per cell. In this paper, we review some of these tests. After presenting the general methodology for the two-factor linear model with an interaction effect and the general two-way ANOVA method when there are n > 1 observations per cell, we present some methods for testing non-additivity when there is only one observation per cell. Finally, we illustrate these methods with examples.
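Tukey's one-degree-of-freedom test mentioned above admits a compact implementation. The sketch below is our own (the example data are invented); it computes the classical single-degree-of-freedom sum of squares for the product-interaction alternative in a two-way table with one observation per cell:

```python
import numpy as np
from scipy.stats import f

def tukey_nonadditivity(y):
    """Tukey's one-degree-of-freedom test for non-additivity in a
    two-way layout with one observation per cell.  The interaction is
    modelled as gamma * alpha_i * beta_j, the product of the row and
    column effects."""
    y = np.asarray(y, dtype=float)
    r, c = y.shape
    grand = y.mean()
    a = y.mean(axis=1) - grand            # row effects
    b = y.mean(axis=0) - grand            # column effects
    # Tukey's SS: [sum_ij a_i b_j y_ij]^2 / (sum a_i^2 * sum b_j^2)
    ss_nonadd = (a @ y @ b) ** 2 / ((a**2).sum() * (b**2).sum())
    ss_resid = ((y - grand - a[:, None] - b[None, :]) ** 2).sum()
    df_error = (r - 1) * (c - 1) - 1
    F = ss_nonadd / ((ss_resid - ss_nonadd) / df_error)
    return F, f.sf(F, 1, df_error)
```

A large F (small p) suggests the additive model is inadequate under the product-interaction alternative.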

3.
We propose a new method for computing power and sample size for linear rank tests of differences between two ordered multinomial populations. The method is flexible in that it is applicable to any general alternative hypothesis and for any choice of rank scores. We show that the method, though asymptotic, closely approximates existing exact methods. At the same time it overcomes the computational limitations of the exact methods. This advantage makes our asymptotic approach more practical for sample size computations at the planning stages of a large study. We illustrate the method with data arising from both proportional and non-proportional odds models in the two ordered multinomial setting.
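The paper's asymptotic formulas are not reproduced in the abstract, but the setting can be illustrated with a brute-force Monte Carlo power estimate for a linear rank test (here the Wilcoxon rank-sum test) between two ordered multinomial populations. The category probabilities and sample size below are invented for illustration:

```python
import numpy as np
from scipy.stats import mannwhitneyu

def simulated_power(p0, p1, n_per_group, alpha=0.05, reps=2000, seed=1):
    """Monte Carlo power of the Wilcoxon rank-sum test for comparing
    two ordered multinomial populations with category probabilities
    p0 and p1 (categories scored 0..k-1)."""
    rng = np.random.default_rng(seed)
    k = len(p0)
    hits = 0
    for _ in range(reps):
        x = rng.choice(k, size=n_per_group, p=p0)
        y = rng.choice(k, size=n_per_group, p=p1)
        if mannwhitneyu(x, y).pvalue < alpha:
            hits += 1
    return hits / reps
```

Simulation like this is the slow, exact-in-the-limit complement to the asymptotic approach the paper advocates for planning large studies.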

4.
In this paper a unified methodological approach to sequential testing of many composite hypotheses and multi-decision change-point detection for composite alternatives is proposed. New performance measures for methods of hypothesis testing and change-point detection are introduced. Theoretical lower bounds for these performance measures are proved that do not depend on the methods of sequential testing and detection. Minimax tests are proposed for which these lower bounds are attained asymptotically as decision thresholds tend to infinity. Results of Monte Carlo experiments are given.
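The paper's minimax procedures are not detailed in the abstract; as background for the change-point side of the problem, here is a minimal sketch of Page's CUSUM detector, a classical sequential change-point statistic. The log-likelihood-ratio function and threshold in the example are invented for illustration:

```python
def cusum(samples, logratio, threshold):
    """Page's CUSUM for sequential change-point detection: tracks
    S_n = max(0, S_{n-1} + log[f1(x_n)/f0(x_n)]) and raises an alarm
    when S_n crosses the decision threshold.  Returns the alarm time,
    or None if the threshold is never reached."""
    s = 0.0
    for n, x in enumerate(samples, start=1):
        s = max(0.0, s + logratio(x))
        if s >= threshold:
            return n
    return None
```

For a unit-variance normal mean shift from 0 to 1, the per-observation log-likelihood ratio is x - 0.5, so a change at time 10 is flagged a few observations later.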

5.
6.
In this note, several aspects of a recently proposed specification test in nonparametric models driven by an absolutely regular process are discussed. In particular, we give a more detailed asymptotic analysis of tests based on kernel methods under fixed alternatives, using a central limit theorem for U-statistics with an n-dependent nondegenerate kernel. As a by-product, it is demonstrated that several results regarding the asymptotic distribution of goodness-of-fit tests are incorrectly stated in the literature. Our result also indicates that results on the asymptotic equivalence of nonparametric autoregression and nonparametric regression cannot be used for the asymptotic analysis of goodness-of-fit tests under fixed alternatives.

7.
Models and computational methods for the sudden release of pressure in gas/vapour-liquid reaction systems. The design of pressure-release sections in safety installations, such as safety discs or safety valves for gas/vapour-liquid reaction systems, is essentially determined by thermodynamic and fluid-dynamic processes as well as by the occurrence of parallel chemical reactions in the reaction vessels and in closed blow-off lines. However, because of the highly complex interrelations and the partly insufficient detailed knowledge, an exact calculation frequently causes difficulties. In order to present and evaluate the current state of the art, the individual phases of pressure release, the most important influencing factors, and their significance for the pressure distribution as a function of time are described first. This is followed by a comparison of several computational methods from the literature which, because they treat the processes involved in a simplified way, permit only very imprecise design of pressure-release sections, or which, because of their specific model assumptions, apply only to particular chemical reaction systems with special boundary conditions. Finally, the problems associated with the development of a generally applicable, reliable computational method are briefly discussed.

8.
All models of this paper involve R × C contingency tables in which the total frequency is fixed (full multinomial model) or in which the row totals are fixed (product multinomial model). For the most part, we assume that the column categories are ordered. For the full multinomial model the null hypothesis of interest is independence, i.e., the (i,j)th cell probability is the product of the marginal probabilities of the ith row and jth column. In the product multinomial model the null hypothesis is that the R multinomial distributions have the same vector of cell probabilities. Our review includes (1) a careful listing of two-sided and one-sided alternatives; (2) methodology to reduce the loss of efficiency of tests caused by the discreteness of the model (the methodologies discussed are efficient in several senses: tests are exact; tests have very favorable and robust power properties; tests make use of back-up statistics, thereby providing a finer grid of p-values; and in some special cases, e.g., a 2 × C table with a one-sided alternative, conditional p-values are found within seconds simply by entering row frequencies into a given website, so computational efficiency is exceptional); and (3) a critique of some exact linear permutation tests (conditional on row and column margins) for both two-sided and some one-sided alternatives. Furthermore, recommendations are made as to which tests to use for specific alternatives.

9.
A number of fading sources have been examined with particular reference to the testing of materials of high light fastness. This work indicates that several light sources may be suitable for carrying out fastness tests on textiles, and an economical lamp is available.

10.
'Exact' methods for categorical data are exact in terms of using probability distributions that do not depend on unknown parameters. However, they are inferentially conservative: the actual error probabilities for tests and confidence intervals are bounded above by the nominal level. This article examines the conservatism for interval estimation and describes ways of reducing it. We illustrate with confidence intervals for several basic parameters, including the binomial parameter, the difference between two binomial parameters for independent samples, and the odds ratio and relative risk. Less conservative behavior results from devices such as (1) inverting tests using statistics that are 'less discrete', (2) inverting a single two-sided test rather than two separate one-sided tests each having size at least half the nominal level, (3) using unconditional rather than conditional methods (where appropriate) and (4) inverting tests using alternative p-values. The article concludes with recommendations for selecting an interval in three situations: when one needs to guarantee a lower bound on a coverage probability, when it is sufficient to have actual coverage probability near the nominal level, and when teaching in a classroom or consulting environment.
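Devices (1) and (2) above can be illustrated for the binomial parameter by contrasting the exact Clopper-Pearson interval (which inverts two one-sided exact tests) with the Wilson score interval (which inverts a single two-sided test based on a less discrete statistic). The sketch is our own illustration, not code from the article:

```python
from scipy.stats import beta, norm

def clopper_pearson(x, n, conf=0.95):
    """Exact (conservative) Clopper-Pearson interval: inverts two
    one-sided exact binomial tests, each of size (1-conf)/2."""
    alpha = 1 - conf
    lo = beta.ppf(alpha / 2, x, n - x + 1) if x > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, x + 1, n - x) if x < n else 1.0
    return lo, hi

def wilson_score(x, n, conf=0.95):
    """Wilson score interval: inverts the single two-sided score test,
    so its actual coverage is typically closer to the nominal level."""
    z = norm.ppf(1 - (1 - conf) / 2)
    p = x / n
    centre = (p + z**2 / (2 * n)) / (1 + z**2 / n)
    half = z * ((p * (1 - p) / n + z**2 / (4 * n**2)) ** 0.5) / (1 + z**2 / n)
    return centre - half, centre + half
```

For 7 successes in 10 trials the exact interval is noticeably wider than the score interval, which is the conservatism the article discusses.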

11.
Thick, high-impedance organic coatings are the class of coatings used to provide corrosion protection to naval vessels, pipelines, gasoline storage tanks, and other large structures such as bridges and plant structures. These coatings, especially the newest generations now in use, can provide such exceptional protection and lifetime of performance that properly and accurately assessing and differentiating among competing coatings is a very difficult task. The standard protocol of salt-fog testing (ASTM B117), immersion testing, and outdoor exposure in a corrosive environment, with subjective evaluation of a coating's performance during and after testing, does not adequately rank and predict coating lifetimes for new coating systems, especially for environmentally compliant systems such as powder coatings (notably the thick, fusion-bonded epoxy (FBE) coatings used for pipelines), two-component epoxy and urethane coatings, and waterborne coatings. New, objective test methods are desperately needed by users and manufacturers of coatings. A relatively new electrochemical test procedure, the electrochemical noise method (ENM), as developed by Skerry and Eden, has been shown in our laboratory to be very successful in ranking and predicting relative coating performance. We have used the method successfully on naval ship coatings, several pipeline coatings and other related systems, and Skerry has used it successfully on industrial maintenance coatings. We have used these methods in conjunction with electrochemical impedance spectroscopy, d.c. resistance measurements and cyclic salt-fog testing of the Prohesion™ type. In our studies of pipeline coatings, we needed to investigate thermal effects because of the coatings' extended range of use temperature. In these studies, we have discovered that electrochemical methods can be used for an in situ measurement of the Tg of coatings immersed in electrolyte.
Further, the 'plasticizing' effect of aqueous electrolyte absorption, as well as its relative irreversibility, has been shown. For all coatings studied, ENM provided useful, objective, numerical data which rapidly rank coatings and provide useful information for relative lifetime prediction of coatings that may offer up to 30 years of service.

12.
The tensile strength of fine-grained soils has been extensively investigated by earlier researchers, and several methodologies have evolved for its determination. However, most of these methods are either not valid or applicable over a wide range of moisture contents, or they involve tedious sample/specimen preparation. In this context, the methodology of determining tensile strength by employing thin films, which is available in the literature, has been found to be quite handy and useful. It has been observed that a unique relationship exists among the tensile strength, moisture content, and shrinkage characteristics of fine-grained soils. This methodology is attractive because of its applicability to a wide range of moisture contents, the comparable ease of sample preparation and testing, and the generality of the obtained results. Exhaustive tests were conducted on fine-grained soils of entirely different characteristics, and generalized relationships have been proposed between the percentage linear shrinkage, tensile strength, and moisture content (defined as the liquid-to-solid ratio). Based on a critical analysis of the results available in the literature, the efficiency of such relationships for determining the tensile strength of fine-grained soils has been demonstrated. In the authors' opinion, such relationships would be quite useful for determining the tensile strength of fine-grained soils from their linear shrinkage, which can easily be measured in a conventional geotechnical engineering laboratory.

13.
A general problem of testing two simple hypotheses about the distribution of a discrete-time stochastic process is considered. The main goal is to minimize an average sample number over all sequential tests whose error probabilities do not exceed some prescribed levels. As a criterion of minimization, the average sample number under a third hypothesis is used (modified Kiefer–Weiss problem). For a class of sequential testing problems, the structure of optimal sequential tests is characterized. An application to the Kiefer–Weiss problem for discrete-time stochastic processes is proposed. As another application, the structure of Bayes sequential tests for two composite hypotheses, with a fixed cost per observation, is given. The results are also applied for finding optimal sequential tests for discrete-time Markov processes. In a particular case of testing two simple hypotheses about a location parameter of an autoregressive process of order 1, it is shown that the sequential probability ratio test has the Wald–Wolfowitz optimality property.
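The sequential probability ratio test named in the abstract's closing remark can be sketched in a few lines. This is a minimal illustration, not the paper's construction: the stopping thresholds use Wald's approximations, and the example log-likelihood ratio (unit-variance normal, mean 0 versus mean 1) is invented:

```python
import math

def sprt(samples, logratio, alpha=0.05, beta_err=0.05):
    """Wald's sequential probability ratio test for H0 vs H1.
    `logratio(x)` returns log f1(x) - log f0(x) for one observation.
    Stops when the cumulative log-likelihood ratio crosses Wald's
    approximate thresholds; returns (decision, n_used)."""
    A = math.log((1 - beta_err) / alpha)   # accept H1 at or above A
    B = math.log(beta_err / (1 - alpha))   # accept H0 at or below B
    llr, n = 0.0, 0
    for x in samples:
        n += 1
        llr += logratio(x)
        if llr >= A:
            return "H1", n
        if llr <= B:
            return "H0", n
    return "undecided", n
```

With error levels of 0.05 the thresholds are ±log 19 ≈ ±2.94, so a stream of strongly informative observations terminates the test after only a handful of samples.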

14.
We review methods for analysing the performance of several diagnostic tests when patients must be classified as having a disease or not, and no gold standard is available. For latent class analysis (LCA) to provide consistent estimates of sensitivity, specificity and prevalence, 'independent errors conditional on disease status' have traditionally been assumed. Recent approaches derive estimators under more flexible assumptions. However, all likelihood-based approaches suffer from the sparseness of tables generated by this type of data, an issue that is often ignored. In light of this, we examine the potential and limitations of LCAs of diagnostic tests, guided by a data set of visceral leishmaniasis tests. In the example, LCA estimates suggest that the traditional reference test, parasitology, has poor sensitivity and underestimates prevalence. From a technical standpoint, including more test results in one analysis yields increasing degrees of sparseness in the table, which is seen to lead to discordant values of asymptotically equivalent test statistics and eventually to lack of convergence of the LCA algorithm. We suggest some strategies to cope with this.

15.
16.
The thermal behaviour of a multipass process gas heater has been investigated using the zone method of analysis, with the aim of estimating the accuracy of previous predictions based on an extended two-flux model of the radiation field. Process gas, furnace gas, tube surface and refractory surface temperature distributions predicted by the two methods are found to agree well with each other and with the limited experimental data available for the heater. Good agreement is also found between the radiative flux densities calculated using the two-flux model and the zone method. Due to the well established accuracy of the zone method of analysis, the closeness of the two sets of predictions indicates that the two-flux model should prove extremely useful in assessing the thermal effects of changes in design or operating conditions in process gas heaters, particularly because of its computational simplicity compared with the zone method.

17.
A number of decentralized and distributed control schemes based on model predictive control (MPC) have been introduced in recent years. They have been proposed as viable solutions to the computational, transmission and robustness issues arising in the centralized context in the case of large-scale and/or distributed plants. Such MPC-based control schemes are very heterogeneous, based on different model structures and realizations, with different features and infrastructural/memory/computational requirements. In this paper, we test and compare, with a realistic case study, a robust non-cooperative scheme and a cooperative iterative one. The main aim is to analyze and unravel, in a fair comparison scenario, these methods from different viewpoints, spanning from model realization issues to communication and computational requirements to control performance. The benchmark case study consists of an existing natural gas refrigeration plant. Realistic simulations and validation tests are obtained in the DynSim industrial process simulation environment.

18.
Developing new, more effective antibiotics that inhibit essential proteins of resistant Mycobacterium tuberculosis is an appealing strategy for combating the global tuberculosis (TB) epidemic. Finding a compound that can target a particular cavity in a protein and interrupt its enzymatic activity is the crucial objective of drug design and discovery. Such a compound is then subjected to different tests, including clinical trials, to study its effectiveness against the pathogen in the host. In recent times, new techniques involving computational and analytical methods have enhanced the chances of drug development, as opposed to traditional drug design methods, which are laborious and time-consuming. Computational techniques in drug design have been improved by a new generation of software used to develop and optimize active compounds for future chemotherapeutic development against global tuberculosis resistance. This review provides an overview of the evolution of tuberculosis resistance, existing drug management, and the design of new anti-tuberculosis drugs developed with the aid of computational techniques. We also provide an appraisal of available software and databases for computational drug design, with insight into their application in the development of anti-tubercular drugs. The review closes with a perspective on machine learning, artificial intelligence, quantum computing, and CRISPR, in combination with available computational techniques, as a prospective pathway to the design of new anti-tubercular drugs against resistant tuberculosis.

19.
Drug discovery is a cost- and time-intensive process that is often assisted by computational methods, such as virtual screening, to speed up and guide the design of new compounds. For many years, machine learning methods have been successfully applied in the context of computer-aided drug discovery. Recently, thanks to the rise of novel technologies as well as the increasing amount of available chemical and bioactivity data, deep learning has had a tremendous impact on rational active compound discovery. Herein, recent applications and developments of machine learning, with a focus on deep learning, in virtual screening for active compound design are reviewed. This includes an introduction to different compound and protein encodings and deep learning techniques, as well as frequently used bioactivity and benchmark data sets for model training and testing. Finally, the present state of the art, including current challenges and emerging problems, is examined and discussed.

20.
Owing to mathematical coupling, statistical analyses relating change to baseline values using correlation or regression are erroneous: the statistical procedure of testing the null hypothesis becomes invalid. Alternatives, such as Oldham's method and the variance ratio test, have been advocated, although these are limited in the presence of measurement errors with non-constant variance. Furthermore, such methods prohibit the consideration of additional covariates (e.g., treatment group within trials) or confounders (e.g., age and gender). This study illustrates the more sophisticated approach of multilevel modelling (MLM), which overcomes these limitations and provides a comprehensive solution to the analysis of change with respect to baseline values. Although mathematical coupling is widespread throughout applied research, one particular area where several studies have suggested a strong relationship between baseline disease severity and treatment effect is guided tissue regeneration (GTR) within dental research. For illustration, we use GTR studies for which the original data were available in the literature for reanalysis. We contrast the results from an MLM approach and Oldham's method with the standard (incorrect) approach that suffers from mathematical coupling. MLM provides a robust solution when relating change to baseline and is capable of simultaneously dealing with complex error structures and additional covariates and/or potential confounders.
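Oldham's method mentioned above correlates change with the average of baseline and follow-up rather than with baseline itself. The sketch below (our own illustration, with simulated data; the MLM approach the study advocates is not sketched) shows the spurious correlation that mathematical coupling induces and how Oldham's method removes it:

```python
import numpy as np

def oldham(pre, post):
    """Oldham's method: correlate change (post - pre) with the average
    (pre + post)/2 rather than with baseline, removing the mathematical
    coupling that biases corr(change, baseline)."""
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    change = post - pre
    naive = np.corrcoef(change, pre)[0, 1]               # coupled, biased
    oldham_r = np.corrcoef(change, (pre + post) / 2)[0, 1]
    return naive, oldham_r
```

With pre and post independent and of equal variance, the naive correlation of change with baseline tends to -1/√2 ≈ -0.71 even though no real relationship exists, while Oldham's correlation is close to zero.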


Copyright © 北京勤云科技发展有限公司  京ICP备09084417号