Similar Documents
20 similar documents found (search time: 15 ms)
1.
In this paper we derive and analyze some a posteriori error estimators for the stabilized P1 nonconforming approximation of the Stokes problem involving the strain tensor. This will be done by decomposing the numerical error in a proper way into conforming and nonconforming contributions. The error estimator for the nonconforming error is obtained in the standard way, and the implicit error estimator for the conforming error is derived by applying the equilibrated residual method. A crucial part of this work is the construction of approximate normal stresses on interelement boundaries which will serve as equilibrated Neumann data for local Stokes problems. It turns out that such normal stresses can be simply computed by local weak residuals of the discrete system plus jumps of the velocity solution and that a stronger equilibration condition is satisfied to ensure solvability of the local Stokes problems. We also derive a simple explicit error estimator based on the nonsymmetric tensor recovery of the normal stress error. Numerical results are provided to illustrate the performance of our error estimators.
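For orientation only, a schematic of how estimators of this kind are usually assembled (the symbols below are our placeholders, not the paper's exact quantities): the discretization error is bounded, up to data oscillation, by the sum of a conforming and a nonconforming contribution.

```latex
\|\mathbf{u}-\mathbf{u}_h\|_{1,h} + \|p-p_h\|_{0}
  \;\lesssim\;
  \Bigl(\sum_{K\in\mathcal{T}_h}\eta_{C,K}^{2}\Bigr)^{1/2}
  + \Bigl(\sum_{K\in\mathcal{T}_h}\eta_{NC,K}^{2}\Bigr)^{1/2}
  + \operatorname{osc}(\mathbf{f})
```

Here \(\eta_{C,K}\) stands for the conforming (equilibrated residual) contribution on element \(K\), \(\eta_{NC,K}\) for the nonconforming contribution, and \(\operatorname{osc}(\mathbf{f})\) for the oscillation of the data.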

2.
In covariance structure analysis, alternative methods of estimation are now regularly available. A variety of statistics, such as estimators, test statistics, and residuals, are computed. The sampling variability of these statistics is known to depend on a matrix Γ which is based on the fourth-order moments of the data. Estimates of these fourth-order moments are expensive to compute, require a lot of computer storage, and have high sampling variability in small to moderate samples. By exploiting the linear relations that typically generate the covariance structure, we have developed conditions under which a matrix Γ*, which depends only on second-order moments of the data, can be used as a substitute for Γ to obtain correct asymptotic distributions for the statistics of interest. In contrast to related work on asymptotic robustness in covariance structure analysis, our theory is developed in the general setting of arbitrary discrepancy functions and addresses a broader class of statistics that include, for instance, goodness of fit statistics that are not necessarily asymptotically χ² distributed, and statistics based on the residuals. Basically, our theory shows that the normal theory form Γ* for Γ can be used whenever an independence assumption (not only uncorrelatedness), which will always hold under normality, carries over to the model with nonnormal variables. This theory is spelled out in sufficient detail and simplicity so that it can be used in everyday practice.
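For orientation, and using standard covariance-structure notation rather than anything stated in the abstract itself, Γ is the asymptotic covariance matrix of the sample covariances, and its normal-theory counterpart Γ* requires only second-order moments:

```latex
\Gamma \;=\; \operatorname{Cov}\!\bigl\{\operatorname{vech}\bigl[(\mathbf{x}-\boldsymbol{\mu})(\mathbf{x}-\boldsymbol{\mu})^{\top}\bigr]\bigr\},
\qquad
\Gamma^{*} \;=\; 2\,D_p^{+}\,(\Sigma\otimes\Sigma)\,(D_p^{+})^{\top}
```

where \(D_p^{+}\) is the Moore–Penrose inverse of the duplication matrix and \(\Sigma\) is the population covariance matrix; Γ* coincides with Γ when the data are normal.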

3.
In this paper, we solve a cell identification problem in a femtocell environment. Wireless communication standards specify physical cell identifiers (PCIs) for user entities (UEs) to discriminate data from different cells and for networks to identify cells. However, in densely deployed areas, the number of PCIs for femtocells may not be sufficient. The scarcity of PCIs results in identification ambiguity due to duplicated PCIs in a certain area, which may result in handover failure. Using time synchronization between femtocells and macro-cells, we propose a scheme to efficiently identify cells in a femtocell environment. The proposed scheme defines the virtual cell identifier (vID) of a femtocell as the system frame number offset of the femtocell with respect to a macro-cell and differentiates femtocells with combinations of vIDs and PCIs. Our scheme can increase the number of cell identifiers up to 1024 times using even coarse local time synchronization without changing the existing physical layer specification. The improvement in cell identification resolution reduces redundant message transactions, resulting in network performance enhancement. Since our scheme demands only local time synchronization between adjacent cells, it can be applied to asynchronous as well as synchronous wireless systems. We verify the proposed scheme via simulations in various environments.
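A minimal sketch of the identifier construction described above. The constants are illustrative (LTE-style 10-bit system frame numbers, i.e. 1024 possible offsets, and 504 PCIs) and the function names are ours, not from the paper:

```python
# Sketch: combine a PCI with a virtual cell ID (vID) defined as the
# system-frame-number (SFN) offset of a femtocell relative to the macro-cell.
SFN_PERIOD = 1024   # number of distinct system frame numbers (illustrative)
NUM_PCI = 504       # number of physical cell identifiers (illustrative)

def virtual_cell_id(femto_sfn: int, macro_sfn: int) -> int:
    """vID = SFN offset of the femtocell with respect to the macro-cell."""
    return (femto_sfn - macro_sfn) % SFN_PERIOD

def extended_cell_id(pci: int, vid: int) -> int:
    """Combine PCI and vID into a single extended identifier."""
    return vid * NUM_PCI + pci

# Two femtocells that collide on PCI 37 remain distinguishable as long as
# their frame timing offsets relative to the macro-cell differ.
cell_a = extended_cell_id(pci=37, vid=virtual_cell_id(femto_sfn=812, macro_sfn=800))
cell_b = extended_cell_id(pci=37, vid=virtual_cell_id(femto_sfn=15, macro_sfn=800))
print(cell_a, cell_b, cell_a != cell_b)   # distinct extended identifiers
```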

4.
A mean-VaR portfolio model with short selling prohibited is established for risky assets whose returns follow non-normal distributions, and it is compared with Markowitz's mean-variance portfolio model and with the mean-VaR portfolio model under normally distributed returns. An empirical example shows that the mean-VaR portfolio model outperforms the mean-variance portfolio model, and that the mean-VaR model based on non-normally distributed returns performs slightly better than the mean-VaR model based on normally distributed returns.
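A minimal numerical sketch of a no-short-selling mean-VaR optimization of the kind compared above, using historical-simulation VaR on simulated fat-tailed (Student-t) returns; all numbers and names are illustrative, not taken from the paper:

```python
# Sketch: minimize historical-simulation VaR subject to a target mean return
# and a no-short-selling constraint (weights >= 0, summing to 1).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
# Illustrative non-normal return sample for 4 assets.
returns = 0.01 * rng.standard_t(df=4, size=(1000, 4)) + np.array([0.0005, 0.0008, 0.0010, 0.0012])

alpha = 0.95          # VaR confidence level
target_mean = 0.0008  # required expected portfolio return

def var_hs(weights):
    """Historical-simulation VaR: negative alpha-quantile of portfolio returns."""
    port = returns @ weights
    return -np.quantile(port, 1.0 - alpha)

n = returns.shape[1]
constraints = [
    {"type": "eq",   "fun": lambda w: np.sum(w) - 1.0},
    {"type": "ineq", "fun": lambda w: returns.mean(axis=0) @ w - target_mean},
]
bounds = [(0.0, 1.0)] * n     # no short selling
result = minimize(var_hs, x0=np.full(n, 1.0 / n), bounds=bounds, constraints=constraints)
print("weights:", np.round(result.x, 3), "VaR:", round(var_hs(result.x), 5))
```

Because the historical-simulation quantile is non-smooth, a parametric VaR would usually be better behaved in practice; the sketch only illustrates the constraint structure.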

5.
In this paper, two mortar versions of the so-called projection nonconforming and the mixed element methods are proposed, respectively, for nonselfadjoint and indefinite second-order elliptic problems. It is proven that the mortar mixed element method is equivalent to the mortar projection nonconforming element method. Based on this equivalence, the existence, uniqueness, and uniform convergence of the solution for the mortar mixed element method are shown only under a minimal regularity assumption. Meanwhile, the optimal error estimate is obtained under a certain regularity assumption. Furthermore, an additive Schwarz preconditioning method is proposed for solving the discrete problem, and the nearly optimal convergence rate for the preconditioned GMRES method is proven under a minimal regularity assumption. Finally, the practical implementation of the method is addressed and numerical experiments are presented.

6.
7.
We give an a posteriori error estimator for low order nonconforming finite element approximations of diffusion-reaction and Stokes problems, which relies on the solution of local problems on stars. It is proved to be equivalent to the energy error up to data oscillation, without requiring a Helmholtz decomposition of the error or a saturation assumption. Numerical experiments illustrate the good behavior and efficiency of this estimator for generic elliptic problems.

8.
The linear discriminant analysis (LDA) is a linear classifier which has proven to be powerful and competitive compared to the main state-of-the-art classifiers. However, the LDA algorithm assumes that the sample vectors of each class are generated from underlying multivariate normal distributions with a common covariance matrix but different means (i.e., homoscedastic data). This assumption has restricted the use of LDA considerably. Over the years, authors have defined several extensions to the basic formulation of LDA. One such method is the heteroscedastic LDA (HLDA), which is proposed to address the heteroscedasticity problem. Another method is the nonparametric DA (NDA), where the normality assumption is relaxed. In this paper, we propose a novel Bayesian logistic discriminant (BLD) model which can address both the normality and heteroscedasticity problems. The normality assumption is relaxed by approximating the underlying distribution of each class with a mixture of Gaussians. Hence, the proposed BLD provides more flexibility and better classification performance than the LDA, HLDA and NDA. Subclass and multinomial versions of the BLD are proposed. The posterior distribution of the BLD model is elegantly approximated by a tractable Gaussian form using variational transformation and Jensen's inequality, allowing a straightforward computation of the weights. An extensive comparison of the BLD to the LDA, support vector machine (SVM), HLDA, NDA and subclass discriminant analysis (SDA), performed on artificial and real data sets, has shown the advantages and superiority of our proposed method. In particular, the experiments on face recognition have clearly shown a significant improvement of the proposed BLD over the LDA.

9.
A general procedure is derived for simulating univariate and multivariate nonnormal distributions using polynomial transformations of order five. The procedure allows for the additional control of the fifth and sixth moments. The ability to control higher moments increases the precision in the approximations of nonnormal distributions and lowers the skew and kurtosis boundary relative to the competing procedures considered. Tabled values of constants are provided for approximating various probability density functions. A numerical example is worked to demonstrate the multivariate procedure. The results of a Monte Carlo simulation are provided to demonstrate that the procedure generates specified population parameters and intercorrelations.
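A minimal sketch of the fifth-order polynomial (power-method) transformation family described above; the constants below are arbitrary placeholders, not the tabled values from the article:

```python
# Fifth-order power-method transformation: a standard normal deviate Z is mapped to
#   Y = c0 + c1*Z + c2*Z**2 + c3*Z**3 + c4*Z**4 + c5*Z**5,
# where the constants are chosen so that Y has prescribed first six moments.
import numpy as np

def fifth_order_transform(z, c):
    """Apply Y = sum_k c[k] * z**k for k = 0..5."""
    return sum(ck * z**k for k, ck in enumerate(c))

rng = np.random.default_rng(1)
z = rng.standard_normal(100_000)

# Illustrative constants only; real applications solve for c so that the output
# matches target skew, kurtosis and the fifth/sixth standardized cumulants.
c = [0.0, 0.95, 0.10, 0.05, -0.01, 0.002]
y = fifth_order_transform(z, c)
print("skew ~", round(float(((y - y.mean())**3).mean() / y.std()**3), 3))
print("kurt ~", round(float(((y - y.mean())**4).mean() / y.std()**4), 3))
```

For the multivariate case, correlated standard normal deviates are transformed component-wise, with an intermediate correlation matrix chosen so that the transformed variables attain the specified intercorrelations.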

10.
We propose a novel method, called Semi-supervised Projection Clustering in Transfer Learning (SPCTL), where multiple source domains and one target domain are assumed. Traditional semi-supervised projection clustering methods hold the assumption that the data and pairwise constraints are all drawn from the same domain. However, many related data sets with different distributions are available in real applications, so the traditional methods cannot be directly extended to such a scenario. One major challenge is how to exploit constraint knowledge from multiple source domains and transfer it to the target domain, where all the data are unlabeled. To handle this difficulty, we construct a common subspace in which the difference in distributions among domains can be reduced. We also introduce a transferred centroid regularization, which acts as a bridge for transferring the constraint knowledge to the target domain by exploiting the geometric structure formed by the centroids from different domains. Extensive experiments on both synthetic and benchmark data sets show the effectiveness of our method.

11.
孙鑫  周昆  石教英 《软件学报》2008,19(4):1004-1015
Existing precomputation-based global illumination rendering algorithms all assume that the materials of the objects in the scene are fixed, so that the transport from incident lighting to outgoing radiance is a linear transformation. By precomputing this linear transformation, global illumination can be rendered in real time under dynamic lighting. However, when materials are allowed to change, this linear transformation no longer holds, so existing algorithms cannot be applied directly to scenes with dynamic materials. This paper proposes a method that, when the object materials in a scene are edited, produces the rendering of the scene under both direct and indirect illumination in real time. The radiance that finally reaches the viewpoint is split into multiple parts according to the number of bounces it has undergone and the materials involved in those reflections; each part is proportional to the product of the materials encountered along the bounce sequence, which converts the nonlinear problem into a linear one. All candidate materials are then represented as linear combinations of a set of basis materials. Assigning these basis materials to the objects in the scene yields various combinations, and the outgoing radiance of all parts is precomputed for every combination. At render time, linearly combining the precomputed data with the coefficients obtained by projecting each object's material onto the basis yields the final globally illuminated result in real time. The method applies to scenes in which the geometry, lighting, and viewpoint are all fixed. Materials are represented by bidirectional reflectance distribution functions (BRDFs); refraction and translucency are not considered. The implementation handles up to two bounces and can render some interesting global illumination effects, such as color bleeding and caustics, in real time.
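A toy sketch of the recombination step described above, under our own simplifying assumptions (a tiny image, random stand-in data for the precomputed radiance, and only the one- and two-bounce parts); it is meant to show the linear-combination idea, not the paper's actual pipeline:

```python
# Every editable material m is written as m = sum_i c_i * b_i over a fixed basis {b_i},
# and the outgoing radiance of each bounce sequence is precomputed with basis materials
# assigned to the objects. At edit time the final image is a weighted sum of the
# precomputed images, with weights given by products of the basis coefficients.
import numpy as np

n_basis = 3           # number of basis BRDFs (illustrative)
H, W = 4, 4           # tiny "image" for illustration

rng = np.random.default_rng(2)
# Precomputed one-bounce images: L1[i] = radiance when the directly lit material is basis i.
L1 = rng.random((n_basis, H, W))
# Precomputed two-bounce images: L2[i, j] = radiance when the first bounce uses basis i
# and the second bounce uses basis j.
L2 = rng.random((n_basis, n_basis, H, W))

def render(c_first, c_second):
    """Recombine precomputed data for edited materials given as basis coefficients."""
    c_first, c_second = np.asarray(c_first), np.asarray(c_second)
    one_bounce = np.tensordot(c_first, L1, axes=(0, 0))
    two_bounce = np.einsum("i,j,ijhw->hw", c_first, c_second, L2)
    return one_bounce + two_bounce

image = render(c_first=[0.6, 0.3, 0.1], c_second=[0.2, 0.5, 0.3])
print(image.shape)    # (4, 4): recombined with a cost linear in the precomputed images
```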

12.
A parametric regression model for right-censored data with a log-linear median regression function and a transformation in both response and regression parts, named the parametric Transform-Both-Sides (TBS) model, is presented. The TBS model has a parameter that handles data asymmetry while allowing various different distributions for the error, as long as they are unimodal symmetric distributions centered at zero. The discussion is focused on the estimation procedure with five important error distributions (normal, double-exponential, Student's t, Cauchy and logistic) and presents properties, associated functions (that is, survival and hazard functions) and estimation methods based on maximum likelihood and on the Bayesian paradigm. These procedures are implemented in TBSSurvival, an open-source fully documented R package. The use of the package is illustrated and the performance of the model is analyzed using both simulated and real data sets.
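In formula form, the structure described in the abstract can be summarized as follows; this is our paraphrase, and the notation (the transformation g_λ, covariates x, coefficients β) is ours rather than the package's:

```latex
g_{\lambda}\bigl(\log T\bigr) \;=\; g_{\lambda}\bigl(\mathbf{x}^{\top}\boldsymbol{\beta}\bigr) + \varepsilon,
\qquad \varepsilon \sim F_{\varepsilon}
```

where \(T\) is the (possibly right-censored) survival time, \(\mathbf{x}^{\top}\boldsymbol{\beta}\) is the log-linear median regression function, the same asymmetry-handling transformation \(g_{\lambda}\) is applied to both sides, and \(F_{\varepsilon}\) is any unimodal symmetric error distribution centered at zero (normal, double-exponential, Student's t, Cauchy or logistic in the package).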

13.
The prior distribution of an attribute in a naïve Bayesian classifier is typically assumed to be a Dirichlet distribution, and this is called the Dirichlet assumption. The variables in a Dirichlet random vector can never be positively correlated and must have the same confidence level as measured by normalized variance. Both the generalized Dirichlet and the Liouville distributions include the Dirichlet distribution as a special case. These two multivariate distributions, also defined on the unit simplex, are employed to investigate the impact of the Dirichlet assumption in naïve Bayesian classifiers. We propose methods to construct appropriate generalized Dirichlet and Liouville priors for naïve Bayesian classifiers. Our experimental results on 18 data sets reveal that the generalized Dirichlet distribution has the best performance among the three distribution families. Not only is the Dirichlet assumption inappropriate, but forcing the variables in a prior to be all positively correlated can also degrade the performance of the naïve Bayesian classifier.
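As background for the role the prior plays, here is a generic illustration (not the article's generalized Dirichlet or Liouville construction) of how a Dirichlet prior on a discrete attribute enters the naïve Bayes estimate as pseudo-counts:

```python
# With a Dirichlet(alpha_1, ..., alpha_K) prior on the K values of a discrete attribute,
# the posterior-mean estimate used by a naive Bayesian classifier is
#   P(X = k | class c) = (n_ck + alpha_k) / (n_c + sum_j alpha_j),
# so the prior simply adds pseudo-counts to the observed counts n_ck.
import numpy as np

def dirichlet_smoothed_probs(counts, alpha):
    """Posterior-mean attribute probabilities for one class."""
    counts, alpha = np.asarray(counts, float), np.asarray(alpha, float)
    return (counts + alpha) / (counts.sum() + alpha.sum())

counts = np.array([12, 3, 0, 5])          # observed value counts within one class
uniform_prior = np.ones(4)                # symmetric Dirichlet (Laplace smoothing)
skewed_prior = np.array([4.0, 1.0, 1.0, 1.0])

print(dirichlet_smoothed_probs(counts, uniform_prior))
print(dirichlet_smoothed_probs(counts, skewed_prior))
```

The generalized Dirichlet and Liouville priors studied in the article relax the restrictions noted above (no positive correlation, equal normalized variance) while remaining defined on the unit simplex.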

14.
Provisioning of quality of service (QoS) is the ultimate goal for any wireless sensor network (WSN). Several factors can influence this requirement, such as the adopted cluster formation algorithm. Almost all WSNs are structured by grouping the sensor nodes into clusters. Not all contemporary cluster formation and routing algorithms (e.g. LEACH) were designed to provide or sustain certain QoS requirements such as delay constraints. Another fundamental design issue is that these algorithms were built and tested under the assumption of uniformly distributed sensor nodes. However, this assumption is not always true. In some industrial applications, and due to the scope of the ongoing monitoring process, sensors are installed and condensed in certain areas while they are widely separated in other areas. Also, unlike random deployment distributions, many applications need deterministic deployment of sensors, such as a grid distribution. In this work, we investigated and characterized the impact of sensor node deployment distributions on the performance of different flavors of the LEACH routing algorithm. In particular, we studied via extensive simulation experiments how the LEACH cluster formation approach affects the delay (inter- and intra-cluster delay) and the energy efficiency expressed in terms of packets per joule for different base station locations and data loads. In this study, we consider four deployment distributions: grid, normal, exponential and uniform. The results showed the significant impact of the node distribution on the network energy efficiency, throughput and delay performance measures. These findings would help the architects of real-time wireless sensor networks, such as secure border sensor networks, to design such networks to meet their specifications effectively and fulfill their critical missions.
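A small sketch of the four node-deployment distributions considered in the study, generated over a hypothetical square field; the field size, node count, and distribution parameters are ours, chosen only for illustration:

```python
# Generate sensor coordinates for the four deployment models on a 100 m x 100 m field.
import numpy as np

rng = np.random.default_rng(3)
N, SIDE = 100, 100.0

def grid_deployment(n, side):
    """Deterministic grid: nodes placed on a regular lattice covering the field."""
    k = int(np.ceil(np.sqrt(n)))
    xs, ys = np.meshgrid(np.linspace(0, side, k), np.linspace(0, side, k))
    return np.column_stack([xs.ravel(), ys.ravel()])[:n]

deployments = {
    "grid":        grid_deployment(N, SIDE),
    "uniform":     rng.uniform(0, SIDE, size=(N, 2)),
    # Normal: nodes condensed around the field centre.
    "normal":      np.clip(rng.normal(SIDE / 2, SIDE / 6, size=(N, 2)), 0, SIDE),
    # Exponential: nodes condensed near one corner, sparse elsewhere.
    "exponential": np.clip(rng.exponential(SIDE / 4, size=(N, 2)), 0, SIDE),
}
for name, pts in deployments.items():
    print(f"{name:12s} mean position = {pts.mean(axis=0).round(1)}")
```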

15.
The program inputs any number of whole-rock analyses, with up to 28 prescribed elements determined. These may be in any order, as oxide or element percentages, and may contain missing data. From the oxide weight percentages, correlation, regression and principal component analysis can be performed. Molecular proportions are computed, and from these the CIPW norms and Niggli numbers may be calculated. Cation proportions then are computed, and Barth's standard cell, basis components, molecular norms and Barth's mesonorms (for oversaturated rocks) may be generated. Line-printer X-Y graphs, X-Y-Z triangular diagrams, or histograms can be generated from any chosen set of parameters. Operation of the program requires no previous computer experience, but the competent user could readily extend the available options.
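For illustration, the first computational step of such a program (converting oxide weight percentages to molecular proportions) is a division by molecular weights. The weights below are standard rounded values, and the analysis is a made-up example with a deliberately missing determination:

```python
# Molecular proportion = oxide weight percent / molecular weight of the oxide.
MOLECULAR_WEIGHTS = {          # g/mol, rounded
    "SiO2": 60.08, "TiO2": 79.87, "Al2O3": 101.96, "Fe2O3": 159.69,
    "FeO": 71.84, "MgO": 40.30, "CaO": 56.08, "Na2O": 61.98, "K2O": 94.20,
}

def molecular_proportions(oxide_wt_percent):
    """Convert an oxide weight-percent analysis to molecular proportions,
    skipping missing (None) determinations."""
    return {
        oxide: wt / MOLECULAR_WEIGHTS[oxide]
        for oxide, wt in oxide_wt_percent.items()
        if wt is not None and oxide in MOLECULAR_WEIGHTS
    }

analysis = {"SiO2": 49.2, "Al2O3": 15.7, "FeO": 10.1, "MgO": 7.7,
            "CaO": 11.0, "Na2O": 2.6, "K2O": 0.4, "TiO2": None}   # missing datum
print(molecular_proportions(analysis))
```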

16.
Remotely sensed multispectral image data are found in grouped form with (say) s spectral components (bands). In this study, a practical method is given for constructing a mixture model, i.e. the probability density function of a mixture of k (3 ≤ k ≤ s) normal distributions, for a spectral class. A new method for estimation of the mixing proportions of spectral components (bands) in the remotely sensed multispectral image data is proposed under the assumption that the spectral component (band) means are different from each other.
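A generic sketch of fitting a k-component normal mixture to one band of simulated pixel values; this uses an off-the-shelf EM fit rather than the article's estimator, and all numbers are invented for illustration:

```python
# Fit a k-component univariate normal mixture to (simulated) pixel values of one band.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
# Illustrative data: three spectral components with different band means.
band = np.concatenate([
    rng.normal(60, 5, 400), rng.normal(95, 7, 350), rng.normal(130, 6, 250),
]).reshape(-1, 1)

k = 3
gm = GaussianMixture(n_components=k, random_state=0).fit(band)
print("mixing proportions:", gm.weights_.round(3))
print("component means:   ", gm.means_.ravel().round(1))
```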

17.
Data sets in numerous areas of application can be modelled by symmetric bivariate nonnormal distributions. Estimation of parameters in such situations is considered when the mean and variance of one variable are, respectively, a linear and a positive function of the other variable. This is typically true of the bivariate t distribution. The resulting estimators are found to be remarkably efficient. Hypothesis testing procedures are developed and shown to be robust and powerful. Real-life examples are given.

18.
Cartesian moments are frequently used global geometrical features in computer vision for object pose estimation and recognition. We derive a closed-form expression for the 3-D Cartesian moment of order p+q+r of a superellipsoid in its canonical coordinate system. We also show how the 3-D Cartesian moments of a globally deformed superellipsoid in general position and orientation can be computed as a linear combination of the 3-D Cartesian moments of the corresponding nondeformed superellipsoid in the canonical coordinate system. Additionally, moments of objects that are compositions of superellipsoids can be computed as simple sums of moments of the individual parts. To demonstrate practical application of the derived results, we register pairs of range images based on moments of recovered compositions of superellipsoids. We use a standard technique to find centers of gravity and principal axes in pairs of range images, while third-order moments are used to resolve the four-way ambiguity. Experimental results show the expected improvement of the recovered rigid transformation based on moments of recovered superellipsoids as compared to registration based on moments of raw range image data. Besides object pose estimation, the presented results can be directly used for object recognition with moments and/or moment invariants as object features.
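For reference, the canonical superellipsoid surface and the moments in question are written below in standard notation, with \(a_1, a_2, a_3\) the size parameters and \(\varepsilon_1, \varepsilon_2\) the shape exponents; this is background notation, not the paper's closed-form result:

```latex
\left(\left(\frac{x}{a_1}\right)^{\!\frac{2}{\varepsilon_2}}
+\left(\frac{y}{a_2}\right)^{\!\frac{2}{\varepsilon_2}}\right)^{\!\frac{\varepsilon_2}{\varepsilon_1}}
+\left(\frac{z}{a_3}\right)^{\!\frac{2}{\varepsilon_1}} = 1,
\qquad
M_{pqr}=\iiint_{V} x^{p}\,y^{q}\,z^{r}\,dx\,dy\,dz
```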

19.
Measurement error models often arise in epidemiological and clinical research. Usually, in this setup it is assumed that the latent variable has a normal distribution. However, the normality assumption may not always be correct. The skew-normal/independent distributions are a class of asymmetric thick-tailed distributions which includes the skew-normal distribution as a special case. In this paper, we explore the use of the skew-normal/independent distribution as a robust alternative in the null intercept measurement error model under a Bayesian paradigm. We assume that the random errors and the unobserved value of the covariate (latent variable) jointly follow a skew-normal/independent distribution, providing an appealing robust alternative to the routine use of the symmetric normal distribution in this type of model. Specific distributions examined include univariate and multivariate versions of the skew-normal distribution, the skew-t distributions, the skew-slash distributions and the skew contaminated normal distributions. The methods developed are illustrated using a real data set from a dental clinical trial.

20.
Reliability analysis of a parallel system of two dependent components that cannot be repaired "as good as new"
A repairable parallel system consisting of two dependent components is studied. Under the assumptions that the component lifetimes follow a bivariate exponential distribution, that the two repair time distributions are general, and that a failed component cannot be repaired "as good as new", the main reliability indices of the system are derived using the geometric process and the supplementary variable method.
