20 similar documents found (search time: 78 ms)
1.
Ordered rule acquisition for interval-valued attribute decision tables based on rough sets (total citations: 3; self-citations: 0; citations by others: 3)
An ordered rule acquisition method for interval-valued attribute decision tables based on rough sets is proposed. First, using the possibility-degree order relation between interval numbers, the interval-valued attribute decision table is converted into a binary decision table. Rough set theory is then applied to analyze the table and derive optimal rules. Finally, the rules of the binary decision table are converted back into ordered rules of the interval-valued attribute decision table. Experimental analysis demonstrates the effectiveness of the method.
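The possibility-degree ordering of interval numbers used in the entry above can be sketched as follows. This is one standard formulation from the interval-comparison literature; the paper's exact definition may differ.

```python
def possibility_degree(a, b):
    """Possibility degree P(a >= b) for intervals a = (a_lo, a_hi), b = (b_lo, b_hi).

    A common definition: P(a >= b) = min(max((a_hi - b_lo) / (len(a) + len(b)), 0), 1),
    where len(x) is the width of interval x."""
    a_lo, a_hi = a
    b_lo, b_hi = b
    denom = (a_hi - a_lo) + (b_hi - b_lo)
    if denom == 0:  # both intervals are degenerate points: compare directly
        return 1.0 if a_lo >= b_lo else 0.0
    return min(max((a_hi - b_lo) / denom, 0.0), 1.0)
```

Note the complementarity property P(a >= b) + P(b >= a) = 1 for non-degenerate intervals, which is what makes this usable as an order relation when converting the table.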
2.
To address the problems of excessive rule counts and low accuracy when classical rough set theory discretizes decision tables with mixed continuous and discrete attributes, a discretization method combining a greedy algorithm with attribute-value interval probabilities is adopted; it targets the shortcoming that traditional approaches to mixed decision tables consider only the discretization of the continuous attributes. First, an improved greedy algorithm performs an initial discretization of the continuous attributes in the mixed decision table. Next, the probability of each value interval of the continuous attributes is computed, and intervals with high probability are refined. Finally, the originally discrete attributes are further discretized, strengthening the system's discernibility. Moreover, the discretized decision table is always consistent; compared with the many discretization methods that ignore decision consistency, this method preserves the system's useful information to the greatest extent. Simulation analysis verifies the effectiveness of the method.
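The greedy cut-selection idea underlying the entry above can be sketched for a single continuous attribute. This is the classic rough-set greedy discretization, not the paper's improved variant (which adds interval-probability refinement and handles discrete attributes too); all names here are illustrative.

```python
from itertools import combinations

def greedy_cuts(values, decisions):
    """Greedily pick cut points (midpoints between sorted distinct values)
    that discern the largest number of still-undiscerned object pairs with
    different decisions, until all such pairs are discerned or no cut helps."""
    pts = sorted(set(values))
    cuts = [(pts[i] + pts[i + 1]) / 2 for i in range(len(pts) - 1)]
    # object pairs that must be discerned: those with different decision values
    pending = {(i, j) for i, j in combinations(range(len(values)), 2)
               if decisions[i] != decisions[j]}
    chosen = []
    while pending and cuts:
        def gain(c):
            # number of pending pairs separated by cut c
            return sum(1 for i, j in pending
                       if (values[i] < c) != (values[j] < c))
        best = max(cuts, key=gain)
        if gain(best) == 0:
            break
        chosen.append(best)
        pending = {(i, j) for i, j in pending
                   if (values[i] < best) == (values[j] < best)}
        cuts.remove(best)
    return sorted(chosen)
```

Because cuts are only added while undiscerned decision-relevant pairs remain, the discretized table stays consistent whenever the original one is.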
3.
Discretization of continuous attributes is very important in the data-preprocessing stage of data analysis. This paper proposes a supervised discretization method for continuous attributes based on class information entropy, using the concept of the consistency level of a decision table from rough set theory. The algorithm has two parts: first, the number of cluster categories is adjusted dynamically according to the consistency level of the decision table, and hierarchical clustering forms the initial clusters; then, adjacent intervals are merged based on class information entropy to reduce the number of intervals. Practice shows the method is feasible.
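The entropy-based merge step in the entry above can be sketched as follows. The merge tolerance and function names are illustrative assumptions, not taken from the paper.

```python
import math
from collections import Counter

def class_entropy(labels):
    """Shannon entropy (bits) of the class distribution within one interval."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def should_merge(left, right, tol=0.0):
    """Merge two adjacent intervals when doing so does not raise the class
    information entropy by more than `tol` over keeping them separate."""
    n = len(left) + len(right)
    separate = (len(left) / n) * class_entropy(left) \
             + (len(right) / n) * class_entropy(right)
    merged = class_entropy(left + right)
    return merged - separate <= tol
```

Intervals dominated by the same class merge freely (no entropy increase), while merging intervals with different majority classes is rejected, which is the behavior the abstract describes.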
4.
In practical applications, large volumes of interval-valued data grow dynamically. If the traditional non-incremental positive-region attribute reduction method is used, the positive-region reduct of the updated interval-valued data set must be recomputed from scratch, greatly reducing the efficiency of attribute reduction. To address this, incremental positive-region attribute reduction methods for interval-valued decision tables are proposed. First, the relevant concepts of positive-region reduction for interval-valued decision tables are given. Then, the positive-region update mechanisms for single-object and group increments are discussed and proved, and single-increment and group-increment positive-region attribute reduction algorithms for interval-valued decision tables are proposed. Finally, experiments are conducted on eight UCI data sets. As the data volume of the eight data sets grows from 60% to 100%, the traditional non-incremental algorithm takes 36.59 s, 72.35 s, 69.83 s, 154.29 s, 80.66 s, 1498.11 s, 4124.14 s, and 809.65 s on the eight data sets, respectively; the single-increment algorithm takes 19.05 s, 46.54 s, 26.98 s, 26.12 s, 34.02 s, 1270.87 s, 1598.78 s, and 408.65 s; and the group-increment algorithm takes 6.39 s, 15.66 s, 3.44 s, 15.06 s, 8.02 s, 167.12 s, 180.88 s, and 61.04 s. The experimental results show that the proposed incremental positive-region attribute reduction algorithms for interval-valued decision tables are highly efficient.
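For readers unfamiliar with the positive region the entry above incrementally maintains, here is the classical (point-valued, non-incremental) construction it generalizes. The dict-based table layout and attribute names are illustrative assumptions.

```python
from collections import defaultdict

def positive_region(objects, condition_attrs, decision_attr):
    """Positive region POS_C(D) of a decision table.

    An object belongs to the positive region iff every object in its
    equivalence class under the condition attributes C has the same
    decision value, i.e. its class is fully contained in one decision class."""
    classes = defaultdict(list)
    for i, obj in enumerate(objects):
        key = tuple(obj[a] for a in condition_attrs)
        classes[key].append(i)
    pos = set()
    for members in classes.values():
        decisions = {objects[i][decision_attr] for i in members}
        if len(decisions) == 1:  # consistent equivalence class
            pos.update(members)
    return pos
```

The incremental algorithms in the paper avoid rebuilding `classes` from scratch when new objects arrive, updating only the equivalence classes the increment touches; for interval values the equivalence relation is replaced by an interval similarity relation.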
5.
6.
7.
Considering the incompleteness of fuzzy information systems and the uncertainty of information values, the rough set theory of incomplete interval-valued fuzzy information systems is discussed, and properties of the rough approximation operators are given. Knowledge discovery in incomplete interval-valued fuzzy information systems is studied, decision rules and attribute reduction based on incomplete interval-valued decision tables are proposed, and a worked example is given.
8.
9.
10.
A continuous uncertain decision table can be viewed as a kind of multi-valued-entry decision table. Using fuzzy set theory, a multi-valued-entry decision table can be converted into a single-valued-entry decision table with membership degrees. On this basis, definitions of the extended information table and decision table are given, and a method is proposed for computing the lower approximation and boundary of decision concepts in multi-valued-entry decision tables, providing deterministic input for rule-induction algorithms.
11.
In this paper, we describe the results of several tests that check the accuracy of numerical computation on the Cray supercomputer in vector and scalar modes. The known tests were modified to identify the critical point where roundings start causing problems. After describing the tests, we present an interval library called libavi.a. It was developed in Fortran 90 on the Cray Y-MP2E supercomputer of UFRGS-Brazil. This library makes interval mathematics accessible to Cray supercomputer users. It works with real and complex intervals and with interval matrices and vectors. The library allows overloading of operators and functions. It is organized in four modules: real intervals, interval vectors and matrices, complex intervals, and linear algebra applications.
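The core of any such library is interval arithmetic with endpoint bookkeeping. A minimal sketch in Python (illustrative only: libavi.a is Fortran 90 and additionally handles complex intervals, interval vectors/matrices, and directed rounding, which this sketch omits):

```python
class Interval:
    """A real interval [lo, hi] with the basic arithmetic operations.

    Multiplication takes the min and max over all four endpoint products,
    which is correct for any sign combination of the operands."""
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"
```

A production library must also round the lower bound down and the upper bound up after each operation so that the true result is always enclosed; that directed rounding is precisely what the vector/scalar accuracy tests described above probe.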
12.
Weldon A. Lodwick, Oscar A. Jenkins 《Soft Computing - A Fusion of Foundations, Methodologies and Applications》2013,17(8):1393-1402
Constrained intervals, intervals as a mapping from [0, 1] to polynomials of degree one (linear functions) with non-negative slopes, and arithmetic on constrained intervals generate a space that turns out to be a cancellative abelian monoid, albeit with a richer set of properties than the usual (standard) space of interval arithmetic. This means that we have not only the classical embedding as developed by H. Rådström and S. Markov, and the extension of E. Kaucher, but also the properties of these polynomials. We study the geometry of the embedding of intervals into a quasilinear space and some of the properties of the mapping of constrained intervals into a space of polynomials. It is assumed that the reader is familiar with the basic notions of interval arithmetic and interval analysis.
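The representation described above can be sketched concretely: an interval [a, b] becomes the linear function x(t) = a + w·t on t ∈ [0, 1] with slope w = b − a ≥ 0, and arithmetic acts on the coefficients. A minimal sketch (illustrative; the paper develops the full algebraic and geometric structure):

```python
class ConstrainedInterval:
    """Interval [a, b] as the linear function x(t) = a + w*t, t in [0, 1],
    with non-negative slope w = b - a."""
    def __init__(self, a, b):
        self.a, self.w = a, b - a  # intercept and slope

    @classmethod
    def from_coeffs(cls, a, w):
        out = cls.__new__(cls)
        out.a, out.w = a, w
        return out

    def __add__(self, other):
        # (a1 + w1*t) + (a2 + w2*t) = (a1 + a2) + (w1 + w2)*t
        return ConstrainedInterval.from_coeffs(self.a + other.a, self.w + other.w)

    def __sub__(self, other):
        # coefficient-wise subtraction: note X - X collapses to {0}
        return ConstrainedInterval.from_coeffs(self.a - other.a, self.w - other.w)

    def bounds(self):
        """Range of x(t) over t in [0, 1]."""
        lo = min(self.a, self.a + self.w)
        hi = max(self.a, self.a + self.w)
        return lo, hi
```

The key payoff is cancellation: for X = [1, 3], X − X has bounds (0, 0), whereas standard interval arithmetic gives [−2, 2]. This is the cancellative-monoid property the abstract highlights.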
13.
A framework for rule-base evidential reasoning in the interval setting applied to diagnosing type 2 diabetes (total citations: 2; self-citations: 0; citations by others: 2)
A new framework for rule-base evidential reasoning in the interval setting is presented. In developing this framework, two collateral problems arise: combining and normalizing interval-valued belief structures from different sources, and comparing the resulting belief intervals, whose bounds are themselves intervals. The first problem is solved with the use of the so-called "interval extended zero" method. It is shown that the interval-valued results of the proposed approach to combining and normalizing interval-valued belief structures are enclosed in those obtained by known methods and possess three desirable, intuitively obvious properties of the normalization procedure defined in the paper. The second problem is solved using a method for interval comparison based on Dempster-Shafer theory that provides interval-valued comparison results. The advantages of the proposed framework for rule-base evidential reasoning in the interval setting are demonstrated using the developed expert system for diagnosing type 2 diabetes.
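As background for the combination problem above, here is the classical Dempster rule for point-valued mass functions; the paper's contribution is generalizing combination and normalization to interval-valued masses, which this sketch does not attempt.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two point-valued mass functions.

    Masses are dicts mapping frozenset focal elements to mass values summing
    to 1. Conflicting mass (empty intersections) is redistributed by
    normalization."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}
```

With interval-valued masses, the products and the normalization denominator become intervals, and naive interval division overshoots; the "interval extended zero" method cited above is one way to keep the normalized result tight.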
14.
Skylar Lei, M.R. Smith 《IEEE transactions on pattern analysis and machine intelligence》2003,29(11):996-1004
Sample statistics and model parameters can be used to infer the properties, or characteristics, of the underlying population in typical data-analytic situations. Confidence intervals can provide an estimate of the range within which the true value of the statistic lies. A narrow confidence interval implies low variability of the statistic, justifying a strong conclusion made from the analysis. Many statistics used in software metrics analysis do not come with theoretical formulas to allow such accuracy assessment. The Efron bootstrap statistical analysis appears to address this weakness. In this paper, we present an empirical analysis of the reliability of several Efron nonparametric bootstrap methods in assessing the accuracy of sample statistics in the context of software metrics. A brief review of the basic concept of various methods available for the estimation of statistical errors is provided, with the stated advantages of the Efron bootstrap discussed. Validations of several different bootstrap algorithms are performed across basic software metrics in both simulated and industrial software engineering contexts. It was found that the 90 percent confidence intervals for the mean, median, and Spearman correlation coefficients were accurately predicted. The 90 percent confidence intervals for the variance and Pearson correlation coefficients were typically underestimated (60-70 percent confidence interval), and those for skewness and kurtosis overestimated (98-100 percent confidence interval). It was found that the bias-corrected and accelerated bootstrap approach gave the most consistent confidence intervals, but its accuracy depended on the metric examined. A method for correcting the under-/overestimation of bootstrap confidence intervals for small data sets is suggested, but the success of the approach was found to be inconsistent across the tested metrics.
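The basic nonparametric bootstrap evaluated above can be sketched compactly. This is the plain percentile method; the bias-corrected and accelerated (BCa) variant the paper favors adjusts the percentile cutoffs and is not shown here.

```python
import random
import statistics

def bootstrap_ci(sample, stat=statistics.mean, n_boot=2000, alpha=0.10, seed=42):
    """Percentile bootstrap confidence interval for an arbitrary statistic.

    Resamples the data with replacement n_boot times, computes the statistic
    on each resample, and returns the empirical (alpha/2, 1 - alpha/2)
    quantiles of the bootstrap distribution."""
    rng = random.Random(seed)
    n = len(sample)
    stats = sorted(stat([rng.choice(sample) for _ in range(n)])
                   for _ in range(n_boot))
    lo = stats[int((alpha / 2) * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

With `alpha=0.10` this yields the 90 percent intervals the study examines; the paper's finding is that this works well for the mean and median but miscalibrates for variance, skewness, and kurtosis.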
15.
Renata M. C. R. de Souza, Diego C. F. Queiroz, Francisco José A. Cysneiros 《Pattern Analysis & Applications》2011,14(3):273-282
This paper introduces different pattern classifiers for interval data based on the logistic regression methodology. Four approaches are considered. These approaches differ according to the way of representing the intervals. The first classifier considers that each interval is represented by its center and performs a classic logistic regression on the centers of the intervals. The second one assumes each interval as a pair of quantitative variables and performs a conjoint classic logistic regression on these variables. The third one considers that each interval is represented by its vertices, and a classic logistic regression on the vertices of the intervals is applied. The last one assumes each interval as a pair of quantitative variables, performs two separate classic logistic regressions on these variables, and combines the results in some appropriate way. Experiments with synthetic data sets and an application with a real interval data set demonstrate the usefulness of these classifiers.
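The first (center-based) approach above is the simplest to sketch: replace each interval feature by its midpoint and run ordinary logistic regression. A minimal gradient-descent version (illustrative; the paper presumably uses standard maximum-likelihood fitting):

```python
import math

def centers(interval_row):
    """Center representation: each interval (lo, hi) -> its midpoint."""
    return [(lo + hi) / 2 for lo, hi in interval_row]

def fit_logistic(X, y, lr=0.1, epochs=500):
    """Classic logistic regression via stochastic gradient descent.
    Returns weights [bias, w1, ..., wd]."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1 / (1 + math.exp(-z))  # predicted probability of class 1
            g = p - yi                  # gradient of the log-loss w.r.t. z
            w[0] -= lr * g
            for j, xj in enumerate(xi):
                w[j + 1] -= lr * g * xj
    return w

def predict(w, xi):
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1 if z >= 0 else 0
```

The vertices approach differs only in the data preparation: each interval row of d intervals expands into 2^d vertex points sharing the row's label, and the same `fit_logistic` is applied to the expanded set.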
16.
17.
For the K-link shortest path (K-SP) problem on an interval graph containing n intervals, an online algorithm for solving the interval-graph K-SP problem is proposed. The specific properties of interval graphs and their shortest-path problem are analyzed, and an improved dynamic programming algorithm combined with a greedy algorithm is used to optimize the online algorithm's time complexity. Theoretical analysis shows that the algorithm runs in O(nK + n lg n) time, matching the complexity of the best known offline algorithm.
18.
This paper aims to start exploring the application of interval techniques to deal with robustness issues in the context of predictive control. The robust stability problem is transformed into that of checking the positivity of a rational function. Modal intervals are presented as a useful tool to deal with this kind of function. Modal interval analysis extends real numbers to intervals, identifying the intervals by the predicates that the real numbers fulfill, unlike classical interval analysis, which identifies the intervals with the set of real numbers that they contain. Modal interval analysis not only simplifies the computation of interval functions but also allows semantic interpretations of the results. These interpretations are applied to the analysis and design of robust predictive controllers for parametric systems. Necessary, sufficient and, in some cases, necessary and sufficient conditions for robust performance are presented. Specifically, an interval procedure is proposed to compute the stability margin of a predictive control law when applied to a class of plants described by discrete-time transfer functions with coefficients that depend polynomially on uncertain parameters.
19.
N. A. Gaidamakin, S. V. Leont’ev, A. A. Yalpaev 《Automatic Documentation and Mathematical Linguistics》2012,46(4):177-182
The features of the expert evaluation of intractable properties (parameters) in the form of interval values on numerical scales are analyzed. To find a consistent evaluation, two methods for averaging evaluations in interval form are considered: the first is based on simple (arithmetic-mean) averaging of the interval boundaries, and the second on weighted averaging. It is proposed that the weighting of the interval boundaries be driven by interval width, according to the following principle: the narrower the evaluation interval, the more qualified the expert evaluation of the property under investigation, and the higher the weight given to the boundaries of the corresponding interval in the averaging. It is shown that, when two intervals are averaged, weighted averaging increases the qualification of the resulting consistent expert evaluation.
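Both averaging methods described above are easy to sketch. The weighting scheme here (weights inversely proportional to interval width) is one natural reading of the stated principle; the paper's exact formula may differ.

```python
def simple_average_interval(intervals):
    """Method 1: arithmetic-mean averaging of the interval boundaries."""
    n = len(intervals)
    return (sum(lo for lo, _ in intervals) / n,
            sum(hi for _, hi in intervals) / n)

def weighted_average_interval(intervals, eps=1e-9):
    """Method 2: width-weighted averaging — narrower intervals (more
    confident experts) get larger weights. `eps` guards against
    zero-width (degenerate) intervals."""
    weights = [1.0 / (hi - lo + eps) for lo, hi in intervals]
    total = sum(weights)
    lo = sum(w * l for w, (l, _) in zip(weights, intervals)) / total
    hi = sum(w * h for w, (_, h) in zip(weights, intervals)) / total
    return lo, hi
```

For the evaluations [1, 2] and [0, 4], the simple average is [0.5, 3.0] while the weighted average is approximately [0.8, 2.4]: the narrow expert dominates, and the consistent evaluation tightens, which is the effect the abstract reports.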