Similar Documents
20 similar documents found (search time: 31 ms)
1.
In this study, a new scheme was presented for the prediction of fetal state from the fetal heart rate (FHR) and uterine contraction (UC) signals obtained from cardiotocogram (CTG) recordings. CTG recordings are widely used during pregnancy and provide very valuable information regarding fetal well-being. Information effectively extracted from these recordings can be used to predict a pathological fetal state and make early intervention possible before irreversible damage to the fetus occurs. The proposed scheme is based on adaptive neuro-fuzzy inference systems (ANFIS). Using features extracted from the FHR and UC signals, an ANFIS was trained to predict the normal and the pathological state. The method was tested with clinical data consisting of 1,831 CTG recordings; 1,655 of these were classified as normal and the remaining 176 as pathological by a consensus of three expert obstetricians. It was demonstrated that the ANFIS-based method was able to classify the normal and pathological states with 97.2% and 96.6% accuracy, respectively.
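No widely used Python library ships ANFIS, so the following is only a minimal sketch of the train-and-evaluate flow described above, with an MLP classifier standing in for the neuro-fuzzy model; the three features and the synthetic data are assumptions for illustration, not the paper's actual feature set.

```python
# Sketch of a CTG-state prediction pipeline. ANFIS is replaced by an MLP
# stand-in; feature names and data are hypothetical, not from the paper.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400
# Hypothetical features extracted from FHR/UC signals:
# baseline FHR (bpm), short-term variability, decelerations per recording.
normal = np.column_stack([rng.normal(140, 5, n), rng.normal(10, 2, n), rng.poisson(0.5, n)])
pathological = np.column_stack([rng.normal(110, 8, n), rng.normal(3, 1, n), rng.poisson(4, n)])
X = np.vstack([normal, pathological])
y = np.array([0] * n + [1] * n)  # 0 = normal, 1 = pathological

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

With well-separated synthetic classes like these, the stand-in classifier reaches high held-out accuracy; the real study's figures of course come from its clinical data.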

2.
Fetal heart rate helps in diagnosing the well-being as well as the distress of the fetus. The cardiotocograph (CTG) monitors fetal heart activity to estimate the fetal tachogram based on the evaluation of ultrasound pulses reflected from the fetal heart. It consists of the simultaneous recording and analysis of the fetal heart rate signal, uterine contraction activity, and fetal movements. A CTG generally comprises a large number of features, and medical datasets in general require many features to predict an activity. Feature selection, also called attribute selection, is the process of selecting a subset of highly relevant features for further analysis. This paper aims to identify the relevant features and discard the redundant ones, consequently reducing the number of features needed to assess the fetal heart rate. Features are selected using unsupervised particle swarm optimization (PSO)-based relative reduct (US-PSO-RR) and compared with unsupervised relative reduct and principal component analysis. The proposed method is then tested by applying various classification algorithms, such as single decision tree, multilayer perceptron neural network, probabilistic neural network, and random forest, together with clustering quality measures such as root mean square error, mean absolute error, the Davies–Bouldin index, and the Xie–Beni index. Empirical results show that the US-PSO-RR feature selection technique outperforms existing methods, producing a remarkable sensitivity of 72.72%, specificity of 97.66%, and F-measure of 74.19%, and the clustering results demonstrate that the error rate produced by US-PSO-RR is also lower.
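The paper's US-PSO-RR is unsupervised and rough-set based; as a rough, hedged sketch of only the binary-PSO half of the idea, the snippet below searches feature subsets with a supervised 1-fold-averaged KNN fitness standing in for the relative-reduct criterion. The data, penalty weight, and swarm settings are all assumptions.

```python
# Binary PSO feature-subset search (illustrative stand-in for US-PSO-RR).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 200
informative = rng.normal(0, 1, (n, 2))          # two features that carry the class
y = (informative.sum(axis=1) > 0).astype(int)
noise = rng.normal(0, 1, (n, 4))                # four irrelevant features
X = np.hstack([informative, noise])
d = X.shape[1]

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(3), X[:, mask.astype(bool)], y, cv=3).mean()
    return acc - 0.01 * mask.sum()              # mild penalty for larger subsets

n_particles, iters = 10, 15
pos = rng.integers(0, 2, (n_particles, d)).astype(float)
vel = rng.normal(0, 1, (n_particles, d))
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest, gbest_f = pbest[pbest_f.argmax()].copy(), pbest_f.max()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, d))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = (rng.random((n_particles, d)) < 1 / (1 + np.exp(-vel))).astype(float)
    f = np.array([fitness(p) for p in pos])
    improved = f > pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    if f.max() > gbest_f:
        gbest, gbest_f = pos[f.argmax()].copy(), f.max()

selected = np.flatnonzero(gbest)                # indices of the chosen features
```

The sigmoid-of-velocity update is the standard binary-PSO trick; the swarm should converge to a small subset containing the informative features.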

3.
解亚萍 《计算机应用》2011,31(5):1409-1412
Many data mining methods can only handle discrete-valued attributes, so continuous attributes must be discretized. This paper proposes a discretization method based on the statistical correlation coefficient, which uses statistical correlation theory to effectively capture the interdependence between classes and attributes and to select optimal cut points. In addition, the variable precision rough set (VPRS) model is incorporated into the discretization to effectively control the loss of information in the data. The proposed method was applied to breast cancer diagnosis data and data from other domains, and experimental results show that it significantly improves the classification accuracy of See5 decision tree learning.
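A hedged sketch of correlation-driven cut-point selection: candidate cuts are midpoints at class changes, and each is scored by the absolute Pearson correlation between the binarized attribute and the class label. This is a simple stand-in for the paper's statistical-correlation criterion; the VPRS control step is omitted, and the data is synthetic.

```python
# Pick the cut point that maximizes class-attribute correlation.
import numpy as np

rng = np.random.default_rng(2)
# Synthetic continuous attribute whose class changes around value 5.0.
x = np.concatenate([rng.normal(3, 1, 100), rng.normal(7, 1, 100)])
y = np.array([0] * 100 + [1] * 100)

# Candidate cuts: midpoints between adjacent sorted values of different classes.
order = np.argsort(x)
xs, ys = x[order], y[order]
cands = [(xs[i] + xs[i + 1]) / 2 for i in range(len(xs) - 1) if ys[i] != ys[i + 1]]

def correlation(cut):
    # |Pearson correlation| between the binarized attribute and the class.
    b = (x > cut).astype(float)
    return abs(np.corrcoef(b, y)[0, 1])

best = max(cands, key=correlation)
```

Restricting candidates to class-boundary midpoints keeps the search small without excluding any cut that can improve class separation.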

4.
Automatically identifying, classifying, and grading the sensitive attributes (fields) of code-obfuscated structured datasets in production environments has become a bottleneck for structured-data privacy protection. This paper proposes an algorithm for the automatic identification and grading of sensitive attributes in structured datasets. Attribute sensitivity is defined using information entropy; by clustering the sensitivities and mining association rules between attributes, the sensitive attributes of an arbitrary structured dataset are identified and their sensitivity quantified. By analyzing mutual-information correlations and association rules among the attributes within each sensitive-attribute cluster, the sensitive attributes are grouped and their average sensitivity quantified, achieving classification and grading of sensitive attributes. Experiments show that the algorithm can identify, classify, and grade the sensitive attributes of arbitrary structured datasets with higher efficiency and precision. Comparative analysis shows that it identifies and grades sensitive attributes in a single pass, requires no prior knowledge of attribute features or a sensitive-feature dictionary, and takes both attribute correlations and association relationships into account.
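The entropy-based sensitivity score can be sketched per column: higher-entropy (more identifying) columns score as more sensitive. The toy table and column names below are invented, and this follows the paper's definition only loosely.

```python
# Per-column Shannon entropy as a simple attribute-sensitivity score.
import numpy as np

def shannon_entropy(values):
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Toy structured records: a near-unique identifier scores high entropy,
# a low-cardinality flag scores low.
table = {
    "id_number": [f"ID{i:04d}" for i in range(100)],
    "gender": ["M", "F"] * 50,
    "city": ["A"] * 40 + ["B"] * 35 + ["C"] * 25,
}
sensitivity = {col: shannon_entropy(vals) for col, vals in table.items()}
```

Clustering these scores and mining inter-attribute association rules, as the paper describes, would then group the columns into sensitivity grades.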

5.
The recognition of accelerative and decelerative patterns in the fetal heart rate (FHR) is one of the tasks carried out manually by obstetricians when they analyze cardiotocograms for information regarding the fetal state. An approach based on artificial neural networks formed by a multilayer perceptron (MLP) is developed. However, since the system uses the FHR signal as direct input, a preceding stage must be incorporated that applies principal component analysis (PCA) so as to make the system independent of the signal baseline. Furthermore, introducing multiresolution into the PCA resolved other problems that were detected in the application of the system. Presented in this paper are the results of validating these systems, designated the PCA-MLP and multiresolution principal component analysis (MR-PCA) systems, against three clinical experts.

6.
This paper proposes a K-nearest neighbor (KNN) classification approach based on principal component analysis (PCA) and applies it to the classification of fetal heart rate and uterine contraction tracings. The main idea is to reduce the dimensionality of both the training and test samples and then classify the reduced test samples with KNN. The method was tested on 2,120 sets of fetal heart rate and uterine contraction tracing data. Experimental results show that this model yields stable classification results with high accuracy, while reducing the complexity of searching for K nearest neighbors in high-dimensional space and lightening the computational burden.
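The PCA-then-KNN idea maps directly onto a scikit-learn pipeline; synthetic data stands in here for the 2,120 FHR/UC tracings, and the component count is an assumption.

```python
# PCA dimensionality reduction followed by KNN classification.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=30, n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = make_pipeline(PCA(n_components=5), KNeighborsClassifier(n_neighbors=5))
model.fit(X_tr, y_tr)
acc = model.score(X_te, y_te)
```

Fitting PCA inside the pipeline ensures the projection is learned from the training split only, matching the train/test protocol the abstract describes.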

7.
The paper presents an overview of the 15-year development of fetal phonocardiography, including work on the signal processing methods applied to identify sound components. Based on the improvements achieved in this field, the paper shows that, beyond the traditional CTG test, phonocardiography may be successfully applied to long-term fetal measurements and home monitoring. In addition, by indicating heart murmurs through a comprehensive analysis of the recorded heart sound, congenital heart defects can also be detected, together with additional features, in the third trimester. This makes early widespread screening possible, combined with the prescribed CTG test, even at home using a telemedicine system.

8.
Databases store large amounts of information about consumer transactions and other kinds of transactions. This information can be used to deduce rules about consumer behavior, and the rules can in turn be used to determine company policies, for instance with regard to production, marketing, and several other areas. Since databases typically store millions of records, and each record can have 100 or more attributes, an initial step is to reduce the size of the database by eliminating attributes that influence the decision minimally or not at all. In this paper we present techniques that can be employed effectively for exact and approximate reduction in a database system. These techniques can be implemented efficiently in a database system using SQL (Structured Query Language) commands. We tested their performance on a real data set and validated them. The results showed that classification performance actually improved with the reduced set of attributes compared to the case where all attributes were present. We also discuss how our techniques differ from statistical methods and other data reduction methods such as rough sets.
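One way to realize the "reduction via SQL" idea is a dispensability check in the rough-set sense: an attribute can be dropped if grouping by the remaining attributes never maps one group of values to two different decisions. A toy sketch with SQLite follows; the table, column names, and data are invented.

```python
# Rough-set-style attribute dispensability test expressed in SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a1 TEXT, a2 TEXT, a3 TEXT, d TEXT)")
rows = [
    ("x", "p", "m", "yes"),
    ("x", "q", "m", "no"),
    ("y", "p", "n", "yes"),
    ("y", "q", "n", "no"),
]
conn.executemany("INSERT INTO t VALUES (?,?,?,?)", rows)

def dispensable(attrs_kept):
    # The kept attributes still determine d iff no value combination
    # maps to more than one distinct decision.
    cols = ", ".join(attrs_kept)
    q = (f"SELECT COUNT(*) FROM (SELECT {cols}, COUNT(DISTINCT d) AS c "
         f"FROM t GROUP BY {cols}) WHERE c > 1")
    return conn.execute(q).fetchone()[0] == 0

# a3 duplicates a1 here, so dropping a3 keeps the table consistent,
# while dropping a2 does not.
```

The same `GROUP BY` / `COUNT(DISTINCT …)` pattern scales to millions of rows inside the database engine, which is the point the abstract makes about SQL-based reduction.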

9.
Cardiotocography is the primary method for biophysical assessment of fetal state, mainly based on the recording and analysis of the fetal heart rate (FHR) signal. Computerized systems for fetal monitoring provide a quantitative analysis of FHR signals; however, effective methods of qualitative assessment that could support the process of medical diagnosis are still needed. Measurement of the hydronium ion concentration (pH) in neonatal cord blood is an objective indicator of the fetal outcome: an improper pH level is a symptom of acidemia resulting from fetal hypoxia. The paper proposes a two-step analysis of fetal heart rate recordings that allows effective prediction of the acidemia risk. The first step consists of fuzzy classification of FHR signals, with fuzzy inference corresponding to the clinical interpretation of signals based on the FIGO guidelines; the goal of the inference is to eliminate recordings indicating fetal wellbeing from the further classification process. In the second step, the remaining recordings are nonlinearly classified using a multilayer perceptron and Lagrangian Support Vector Machines (LSVM). The proposed procedures are evaluated using data collected with a computerized fetal surveillance system. Assessment performance is evaluated with the number of correct classifications (CC) and a quality index (QI) defined as the geometric mean of sensitivity and specificity. The highest CC = 92.0% and QI = 88.2% were achieved for the Weighted Fuzzy Scoring System combined with the LSVM algorithm. The obtained results confirm the efficacy of the proposed methods of computerized FHR analysis in evaluating the risk of neonatal acidemia.

10.
牟琦  毕孝儒  厍向阳 《计算机工程》2011,37(14):103-105
Irrelevant and redundant attributes in high-dimensional network data tend to slow down classification-based network intrusion detection and lower its detection rate. To address this, a network intrusion feature selection method based on a genetic quantum-behaved particle swarm optimization (GQPSO) algorithm is proposed. The method combines the selection and mutation strategies of genetic algorithms with QPSO to form the GQPSO algorithm, and uses the normalized mutual information between network data attributes as its fitness function to guide attribute reduction on the network data and achieve an optimized subset of network intrusion features. Simulation experiments on the KDD CUP 1999 dataset show that, compared with the QPSO and PSO algorithms, the method reduces network data features more effectively and improves both the speed and the detection rate of classification-based network intrusion detection.

11.
Background: Cardiotocography (CTG) reflects the fetus's health inside the womb during labor. However, assessment of its readings can be a highly subjective process depending on the expertise of the obstetrician. Digital signals from fetal monitors acquire parameters such as fetal heart rate, contractions, and acceleration. Objective: This paper aims to classify CTG readings containing imbalanced healthy, suspected, and pathological fetus readings. Method: We perform two sets of experiments. First, we employ five classifiers, Random Forest (RF), Adaptive Boosting (AdaBoost), Categorical Boosting (CatBoost), Extreme Gradient Boosting (XGBoost), and Light Gradient Boosting Machine (LGBM), without over-sampling, to classify CTG readings into three categories: healthy, suspected, and pathological. Second, we employ an ensemble of the above classifiers with over-sampling, using a random over-sampling technique to balance the CTG records used to train the ensemble models. We use 3,602 CTG readings to train the ensemble classifiers and 1,201 records to evaluate them. The outcomes of these classifiers are then fed into a soft voting classifier to obtain the most accurate results. Results: Each classifier is evaluated by accuracy, precision, recall, F1-score, and Area Under the Receiver Operating Curve (AUROC). Results reveal that the XGBoost, LGBM, and CatBoost classifiers yielded 99% accuracy. Conclusion: Using ensemble classifiers over a balanced CTG dataset improves detection accuracy compared to previous studies and to our first experiment. A soft voting classifier then compensates for the weaknesses of individual classifiers to yield superior overall performance.
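The second experiment, random over-sampling followed by a soft-voting ensemble, can be sketched with scikit-learn alone. CatBoost/XGBoost/LGBM are swapped here for scikit-learn estimators, the over-sampling is done by hand instead of with `imbalanced-learn`, and the imbalanced three-class dataset is synthetic.

```python
# Random over-sampling + soft-voting ensemble on an imbalanced 3-class task.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import (RandomForestClassifier, GradientBoostingClassifier,
                              VotingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Imbalanced classes standing in for healthy / suspected / pathological.
X, y = make_classification(n_samples=1500, n_features=20, n_informative=8,
                           n_classes=3, weights=[0.78, 0.14, 0.08], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Random over-sampling: duplicate minority-class rows until classes balance.
rng = np.random.default_rng(0)
counts = np.bincount(y_tr)
parts_X, parts_y = [], []
for c, cnt in enumerate(counts):
    idx = np.flatnonzero(y_tr == c)
    extra = rng.choice(idx, counts.max() - cnt, replace=True)
    sel = np.concatenate([idx, extra])
    parts_X.append(X_tr[sel])
    parts_y.append(y_tr[sel])
X_bal, y_bal = np.vstack(parts_X), np.concatenate(parts_y)

ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("gb", GradientBoostingClassifier(random_state=0)),
                ("lr", LogisticRegression(max_iter=1000))],
    voting="soft")  # averages predicted probabilities across members
ensemble.fit(X_bal, y_bal)
acc = ensemble.score(X_te, y_te)
```

Note that over-sampling is applied only to the training split, so the held-out evaluation keeps the original class imbalance, as in the paper's protocol.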

12.
Among the existing methods for selecting test attributes in decision trees, none integrates the handling of records with missing attribute values into the test-attribute selection process, and the existing methods for handling missing attribute values all introduce bias to varying degrees. This paper therefore proposes using an information gain ratio based on joint entropy as the criterion for selecting decision tree test attributes, so as to eliminate the influence of records with missing values on test-attribute selection while the tree is being built. Comparative experiments on the WEKA machine learning platform show that the improved algorithm raises both the overall execution efficiency and the classification accuracy.

13.
The aim of the present study is (1) to evaluate the performance of two series of European Remote Sensing (ERS) Synthetic Aperture Radar (SAR) images for land cover classification of a Mediterranean landscape (Minorca, Spain), compared with multispectral information from Système Pour l'Observation de la Terre (SPOT) and Landsat Thematic Mapper (TM) sensors, and (2) to test the synergy of SAR and optical data with a fusion method based on the Dempster–Shafer evidence theory, which is designed to deal with imprecise information. As a first step we evaluated the contribution of multitemporal ERS data and contextual methods of classification, with and without filtering, for the discrimination of vegetation types. The present study shows the importance of time series of the ERS sensor and of the vectorial MMSE (minimum mean square error) filter based on segmentation for land cover classification. Fifteen land cover classes were discriminated (eight concerning different vegetation types) with a mean producer's accuracy of 0.81 for a five-date time series within 1998, and of 0.71 for another four-date time series for 1994/1995. These results are comparable to those from SPOT XS images, 0.69 for July and 0.67 for October (0.85 for July plus October), and from TM data (0.81), and are corroborated by the kappa coefficient of agreement. The fusion of the 1994 ERS series and XS (July), based on a method derived from the Dempster–Shafer evidence theory, shows a slight improvement in overall accuracies: +0.06 in mean producer's accuracy and +0.04 in kappa coefficient.
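Dempster–Shafer fusion combines per-source mass functions over subsets of the label frame and renormalizes away the conflicting mass. A minimal sketch of Dempster's rule of combination follows; the land-cover labels and the two mass functions are invented for illustration.

```python
# Dempster's rule of combination for two mass functions.
def dempster_combine(m1, m2):
    """Combine two mass functions given as {frozenset: mass} dicts.

    Mass assigned to empty intersections is conflict; the rest is
    renormalized by 1 - conflict, per Dempster's rule.
    """
    combined, conflict = {}, 0.0
    for A, mA in m1.items():
        for B, mB in m2.items():
            inter = A & B
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mA * mB
            else:
                conflict += mA * mB
    k = 1.0 - conflict
    return {A: v / k for A, v in combined.items()}

# Two sources (e.g. a SAR and an optical classifier) expressing belief over
# hypothetical land-cover labels; the full frame models ignorance.
frame = frozenset({"veg", "bare", "water"})
m_sar = {frozenset({"veg"}): 0.6, frame: 0.4}
m_opt = {frozenset({"veg"}): 0.5, frozenset({"bare"}): 0.2, frame: 0.3}
fused = dempster_combine(m_sar, m_opt)
```

Assigning mass to the whole frame is how the theory represents the "imprecise information" the abstract mentions: a source can decline to commit between labels.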

14.
Objective: When sonographers manually search for and capture fetal cardiac view images in ultrasound, the frequent manual pausing and screenshot operations often cause them to miss the best moment to acquire a cardiac view. When a deep visual object detection or classification network alone is used to acquire views automatically, the network cannot be guaranteed to focus on the fine-grained features of the relatively small cardiac region within the view image, leading to a high false-detection rate; moreover, the optimal imaging moments of different cardiac anatomical structures are often not synchronized. To address these problems, an algorithm is proposed for automatically acquiring standard four-chamber (4CH) view images that combines object detection and classification networks and fuses temporal relationships between key frames. Method: First, an object detection network is trained on a self-built dataset of fetal cardiac ultrasound views to quickly and accurately localize the four-chamber region and the descending aorta region. Then, when a descending aorta region is detected in video frames within a given time window, candidate regions containing the four-chamber target are extracted and fed into a classification network trained on a self-built dataset of standard four-chamber region images, which further identifies standard four-chamber regions. Finally, reliable descending aorta regions are determined through temporal relationships, and the detection confidence of the reliable descending aorta and the classification outputs for the four-chamber regions of the view images within the same time window are weighted to compute a score for the standard four-chamber view image. Results: The YOLOv5x (you only look once version 5 extra large...

15.

Image texture can be an important source of data in the image classification process. Although not as easily measurable as image spectral attributes, image texture has proved in a number of cases to be a valuable source of data capable of increasing the accuracy of the classification process. In remote sensing there are cases in which classes are spectrally very similar but present distinct spatial distributions, i.e. different textural characteristics; image texture then becomes an important source of information in the classification process. The aim of this study is (1) to develop and test a supervised image classification method based on image spatial texture as extracted by the Gabor filtering concept, and (2) to investigate experimentally the performance of the classification process as a function of the Gabor filter's parameters. A set of Gabor filters is initially generated for the given image data. The filter parameters related to the relevant spatial frequencies present in the image are estimated from the available samples via the Fourier transform. Each filter generates one filtered image which characterizes the particular spatial frequency implemented by the filter parameters. As a result, a number of filtered images, sometimes referred to as 'textural bands', are generated, and the originally univariate problem is transformed into a multivariate one, every pixel being defined by a vector with dimension identical to the number of filters used. The multidimensional image data can then be classified by implementing an appropriate supervised classification method; in this study the Euclidean Minimum Distance and the Gaussian Maximum Likelihood classifiers are used. The adequacy of the selected Gabor filter parameters (namely, the spatial frequency and the filter's spatial extent) is then examined as a function of the resulting classification accuracy. The proposed supervised methodology is tested using both synthetic and real image data, and the results are presented and analysed.
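The Gabor "textural bands" described above can be sketched with NumPy/SciPy. The kernel construction, the two synthetic single-frequency textures, and the per-pixel argmax labeling below are illustrative assumptions; the study itself uses minimum-distance and maximum-likelihood classifiers on the band vectors.

```python
# Gabor filter bank producing "textural bands" on a synthetic two-texture image.
import numpy as np
from scipy.ndimage import convolve, uniform_filter

def gabor_kernel(freq, theta=0.0, sigma=3.0, size=15):
    """Real (cosine) Gabor kernel: Gaussian envelope times a plane wave."""
    half = size // 2
    yy, xx = np.mgrid[-half:half + 1, -half:half + 1]
    rot = xx * np.cos(theta) + yy * np.sin(theta)
    return np.exp(-(xx**2 + yy**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * rot)

# Two synthetic textures with distinct spatial frequencies, side by side.
x = np.arange(64)
low = np.tile(np.sin(2 * np.pi * 0.1 * x), (64, 1))
high = np.tile(np.sin(2 * np.pi * 0.3 * x), (64, 1))
image = np.hstack([low, high])  # columns 0-63 low-freq, 64-127 high-freq

# One "textural band" per filter: locally smoothed response energy.
bands = [uniform_filter(convolve(image, gabor_kernel(f)) ** 2, size=9)
         for f in (0.1, 0.3)]
labels = np.argmax(np.stack(bands), axis=0)  # 0 = low-freq, 1 = high-freq
```

Each band responds strongly only where the image's local spatial frequency matches the filter, so a per-pixel comparison of band energies recovers the texture boundary away from the border.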

16.
We explore a utilization of Boolean matrix factorization for data preprocessing in classification of Boolean data. In our previous work, we demonstrated that preprocessing that consists in replacing the original Boolean attributes by factors, i.e. new Boolean attributes obtained from the original ones by Boolean matrix factorization, can improve classification quality. The aim of this paper is to explore the question of how the various Boolean factorization methods that were proposed in the literature impact the quality of classification. In particular, we compare five factorization methods, present experimental results, and outline issues for future research.
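What "replacing attributes by factors" means can be shown on a toy Boolean matrix. The factorization below is given by hand purely for illustration; a real method (e.g. GreConD) would search for the factors.

```python
# Boolean matrix factorization: X = A o B under the Boolean (OR-AND) product.
import numpy as np

# Toy Boolean data: 4 objects x 5 attributes.
X = np.array([[1, 1, 0, 1, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 1, 1, 1, 1]], dtype=bool)

# A hand-picked factorization: A assigns objects to 2 factors,
# B describes each factor by the attributes it implies.
A = np.array([[1, 0],
              [1, 0],
              [0, 1],
              [1, 1]], dtype=bool)
B = np.array([[1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1]], dtype=bool)

# Boolean matrix product: X_hat[i, j] = OR_k (A[i, k] AND B[k, j]).
X_hat = (A[:, :, None] & B[None, :, :]).any(axis=1)

# For classification preprocessing, the 2 factor columns of A replace
# the 5 original attribute columns of X.
```

Since the reconstruction is exact here, a classifier trained on `A` sees the same information as one trained on `X`, in two attributes instead of five.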

17.
李文秀  李金宝 《电脑学习》2011,(3):13-16,26
Maternal and fetal monitoring systems have become an effective means of antepartum and intrapartum obstetric monitoring: they can diagnose fetal health from the fetal heart rate curve, and they play an important role in improving the health of newborns and reducing the number of infants born with disabilities. Traditional maternal-fetal monitoring systems, however, are bulky, complicated to operate, poor in real-time performance, and expensive. This paper therefore designs an intelligent maternal-fetal monitoring system in which an algorithm embedded in a wireless smart sensor extracts a stable fetal heart rate and transmits it, together with the pregnant woman's other monitoring data, wirelessly to a local or remote monitoring center. The local monitoring device or the remote monitoring center processes, analyzes, and displays the data in real time and raises alarms, assisting physicians or feeding the information back promptly so that the pregnant woman can seek diagnosis and treatment.

18.
Since the monitoring data of new-energy intelligent vehicles contains too many continuous attributes, a supervised discretization algorithm based on the discernibility matrix and the information gain ratio is proposed to reduce the value precision of the continuous attributes and make the subsequently built classification models for new-energy intelligent vehicles more generalizable. While preserving classification performance, the algorithm obtains as few resulting cut points as possible, optimizing the traditional discretization algorithm in three respects. First, a candidate cut-point discernibility matrix is built from the condition and decision attributes of the decision table, and the matrix is used to judge whether a possible cut point exists between adjacent attribute values. Second, the information gain ratio is used to optimize the selection of resulting cut points. Third, setting a stopping threshold resolves the problem that traditional algorithms, whose stopping conditions are too strict, select too many resulting cut points and discretize poorly. Experimental results show that the improved algorithm effectively reduces the number of cut points, greatly improves computational efficiency, and obtains discretization results close to those of classical algorithms.
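A hedged sketch of two of the three ingredients: candidate cuts restricted to class boundaries (the discernibility idea) and gain-ratio scoring of each candidate. The stopping-threshold logic is omitted and the data is synthetic.

```python
# Gain-ratio scoring of class-boundary cut candidates for discretization.
import numpy as np

def entropy(y):
    _, c = np.unique(y, return_counts=True)
    p = c / c.sum()
    return float(-(p * np.log2(p)).sum())

def gain_ratio(x, y, cut):
    left, right = y[x <= cut], y[x > cut]
    n = len(y)
    cond = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    gain = entropy(y) - cond
    split_info = entropy(x > cut)  # entropy of the binary partition itself
    return gain / split_info if split_info > 0 else 0.0

rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(20, 3, 150), rng.normal(40, 3, 150)])
y = np.array([0] * 150 + [1] * 150)

# Discernibility idea: only midpoints between adjacent values of different
# classes can possibly separate objects, so only those are candidates.
order = np.argsort(x)
xs, ys = x[order], y[order]
cands = [(xs[i] + xs[i + 1]) / 2 for i in range(len(xs) - 1) if ys[i] != ys[i + 1]]
best_cut = max(cands, key=lambda c: gain_ratio(x, y, c))
```

Normalizing the information gain by the split information penalizes cuts that produce very lopsided partitions, which is why the gain ratio tends to select fewer, better-placed cut points.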

19.
Dimensionality reduction of rotor fault datasets by fusing global and local discriminant information
To address the problem that traditional dimensionality reduction methods cannot preserve global feature information and local discriminant information at the same time, a dimensionality reduction method for rotor fault datasets is proposed that combines kernel principal component analysis (KPCA) with orthogonal locality sensitive discriminant analysis (OLSDA). The method first uses KPCA to effectively reduce correlation in the dataset and eliminate redundant attributes, thereby preserving the global nonlinear information of the original data to the greatest extent; it then uses OLSDA to fully mine the local manifold structure of the data, extracting low-dimensional essential features with high discriminative power. A feature of this method is that the simultaneous orthogonalization prevents distortion of the local subspace structure. Three-dimensional plots visualize the low-dimensional results, and the recognition rate of a K-nearest neighbor (KNN) classifier on the low-dimensional feature subset, together with the between-class distance Sb and within-class distance Sw from cluster analysis, serve as indicators of the dimensionality reduction effect. Experiments show that the method comprehensively extracts global and local discriminant information, makes fault classes more clearly separated, and accordingly improves recognition accuracy markedly. This research provides a theoretical reference for the visualization and classification of high-dimensional, nonlinear mechanical fault datasets.
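OLSDA has no mainstream Python implementation, so the sketch below pairs scikit-learn's `KernelPCA` (the global nonlinear step) with LDA as a stand-in for the orthogonal local discriminant step, then scores the result with KNN as the abstract describes. The concentric-circles data is a toy substitute for rotor fault features.

```python
# KPCA (global nonlinear step) + LDA stand-in (discriminant step) + KNN score.
from sklearn.datasets import make_circles
from sklearn.decomposition import KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

# Nonlinearly separable toy "fault" data.
X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

kpca = KernelPCA(n_components=4, kernel="rbf", gamma=10.0).fit(X_tr)
Z_tr, Z_te = kpca.transform(X_tr), kpca.transform(X_te)

# OLSDA is not available; LDA stands in for the supervised projection
# onto a discriminative low-dimensional subspace.
lda = LinearDiscriminantAnalysis(n_components=1).fit(Z_tr, y_tr)
knn = KNeighborsClassifier().fit(lda.transform(Z_tr), y_tr)
acc = knn.score(lda.transform(Z_te), y_te)
```

The RBF KPCA step unwraps the circles so that a linear discriminant projection can separate them, mirroring the paper's "global nonlinear first, local discriminant second" design.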

20.
刘星毅 《微机发展》2008,18(5):70-72
Classification is a core problem in data mining and machine learning. To achieve the highest possible classification accuracy, the choice of the splitting attribute at each node is crucial in decision tree classification. Common methods for selecting the splitting attribute include the information entropy method and the Gini index method. This paper analyzes the strengths and weaknesses of the common entropy-based selection methods and proposes a chi-square-test-based method for selecting decision tree splitting attributes, demonstrating its advantages with a real example and simulation experiments. The experimental results show that the proposed algorithm achieves a lower classification error rate than the entropy-based methods.
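The chi-square split criterion can be sketched with `scipy.stats.chi2_contingency`: build the attribute-by-class contingency table for each candidate attribute and split on the one with the largest statistic (smallest p-value). The data and attribute names below are invented.

```python
# Chi-square test as a decision-tree split-attribute selection criterion.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(4)
n = 300
# 'symptom' is associated with the class; 'noise' is independent of it.
y = rng.integers(0, 2, n)
symptom = np.where(rng.random(n) < 0.8, y, 1 - y)  # agrees with class 80% of the time
noise = rng.integers(0, 2, n)

def chi2_stat(attr, target):
    table = np.zeros((2, 2))
    for a, t in zip(attr, target):
        table[a, t] += 1
    stat, p, dof, expected = chi2_contingency(table)
    return stat, p

s_stat, s_p = chi2_stat(symptom, y)
n_stat, n_p = chi2_stat(noise, y)
# The attribute with the larger chi-square statistic is chosen for the split.
```

Unlike information gain, the chi-square statistic comes with a significance test, so a node can also refuse to split when no attribute is significantly associated with the class.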
